00:00:00.001 Started by upstream project "autotest-per-patch" build number 126182 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.010 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.010 The recommended git tool is: git 00:00:00.011 using credential 00000000-0000-0000-0000-000000000002 00:00:00.012 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.027 Fetching changes from the remote Git repository 00:00:00.029 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.052 Using shallow fetch with depth 1 00:00:00.052 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.052 > git --version # timeout=10 00:00:00.074 > git --version # 'git version 2.39.2' 00:00:00.074 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.118 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.118 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.358 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.372 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.385 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD) 00:00:02.385 > git config core.sparsecheckout # timeout=10 00:00:02.399 > git read-tree -mu HEAD # timeout=10 00:00:02.417 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5 00:00:02.445 Commit message: "inventory: add WCP3 to free inventory" 00:00:02.445 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10 00:00:02.539 [Pipeline] Start of Pipeline 00:00:02.556 [Pipeline] library 00:00:02.558 Loading library shm_lib@master 00:00:02.558 Library shm_lib@master is cached. Copying from home. 00:00:02.575 [Pipeline] node 00:00:02.590 Running on WFP51 in /var/jenkins/workspace/crypto-phy-autotest 00:00:02.592 [Pipeline] { 00:00:02.605 [Pipeline] catchError 00:00:02.607 [Pipeline] { 00:00:02.620 [Pipeline] wrap 00:00:02.631 [Pipeline] { 00:00:02.639 [Pipeline] stage 00:00:02.641 [Pipeline] { (Prologue) 00:00:02.812 [Pipeline] sh 00:00:03.093 + logger -p user.info -t JENKINS-CI 00:00:03.112 [Pipeline] echo 00:00:03.114 Node: WFP51 00:00:03.120 [Pipeline] sh 00:00:03.414 [Pipeline] setCustomBuildProperty 00:00:03.433 [Pipeline] echo 00:00:03.435 Cleanup processes 00:00:03.442 [Pipeline] sh 00:00:03.726 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.726 4017146 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.737 [Pipeline] sh 00:00:04.016 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.016 ++ grep -v 'sudo pgrep' 00:00:04.016 ++ awk '{print $1}' 00:00:04.016 + sudo kill -9 00:00:04.016 + true 00:00:04.030 [Pipeline] cleanWs 00:00:04.040 [WS-CLEANUP] Deleting project workspace... 00:00:04.040 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.046 [WS-CLEANUP] done 00:00:04.050 [Pipeline] setCustomBuildProperty 00:00:04.063 [Pipeline] sh 00:00:04.339 + sudo git config --global --replace-all safe.directory '*' 00:00:04.416 [Pipeline] httpRequest 00:00:04.430 [Pipeline] echo 00:00:04.431 Sorcerer 10.211.164.101 is alive 00:00:04.437 [Pipeline] httpRequest 00:00:04.440 HttpMethod: GET 00:00:04.441 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:04.442 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:04.443 Response Code: HTTP/1.1 200 OK 00:00:04.444 Success: Status code 200 is in the accepted range: 200,404 00:00:04.444 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:05.725 [Pipeline] sh 00:00:06.006 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:06.023 [Pipeline] httpRequest 00:00:06.053 [Pipeline] echo 00:00:06.054 Sorcerer 10.211.164.101 is alive 00:00:06.062 [Pipeline] httpRequest 00:00:06.067 HttpMethod: GET 00:00:06.067 URL: http://10.211.164.101/packages/spdk_9cede6267a64dd2a5ba05a728370e06a035ce449.tar.gz 00:00:06.068 Sending request to url: http://10.211.164.101/packages/spdk_9cede6267a64dd2a5ba05a728370e06a035ce449.tar.gz 00:00:06.088 Response Code: HTTP/1.1 200 OK 00:00:06.089 Success: Status code 200 is in the accepted range: 200,404 00:00:06.090 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_9cede6267a64dd2a5ba05a728370e06a035ce449.tar.gz 00:01:03.078 [Pipeline] sh 00:01:03.357 + tar --no-same-owner -xf spdk_9cede6267a64dd2a5ba05a728370e06a035ce449.tar.gz 00:01:05.894 [Pipeline] sh 00:01:06.168 + git -C spdk log --oneline -n5 00:01:06.168 9cede6267 test/check_so_deps: Enforce release build (non-debug) when requested 00:01:06.168 6151edad3 test/check_so_deps: Simplify check_header_filenames() 00:01:06.168 44e72e4e7 autopackage: Rename autopackage.sh to release_build.sh 00:01:06.168 255871c19 autopackage: Move core of the script to autobuild 00:01:06.168 bd4841ef7 autopackage: Replace SPDK_TEST_RELEASE_BUILD with SPDK_TEST_PACKAGING 00:01:06.180 [Pipeline] } 00:01:06.197 [Pipeline] // stage 00:01:06.206 [Pipeline] stage 00:01:06.208 [Pipeline] { (Prepare) 00:01:06.228 [Pipeline] writeFile 00:01:06.244 [Pipeline] sh 00:01:06.522 + logger -p user.info -t JENKINS-CI 00:01:06.535 [Pipeline] sh 00:01:06.814 + logger -p user.info -t JENKINS-CI 00:01:06.826 [Pipeline] sh 00:01:07.105 + cat autorun-spdk.conf 00:01:07.105 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:07.105 SPDK_TEST_BLOCKDEV=1 00:01:07.105 SPDK_TEST_ISAL=1 00:01:07.105 SPDK_TEST_CRYPTO=1 00:01:07.105 SPDK_TEST_REDUCE=1 00:01:07.105 SPDK_TEST_VBDEV_COMPRESS=1 00:01:07.105 SPDK_RUN_UBSAN=1 00:01:07.111 RUN_NIGHTLY=0 00:01:07.143 [Pipeline] readFile 00:01:07.171 [Pipeline] withEnv 00:01:07.173 [Pipeline] { 00:01:07.187 [Pipeline] sh 00:01:07.466 + set -ex 00:01:07.467 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:01:07.467 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:07.467 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:07.467 ++ SPDK_TEST_BLOCKDEV=1 00:01:07.467 ++ SPDK_TEST_ISAL=1 00:01:07.467 ++ SPDK_TEST_CRYPTO=1 00:01:07.467 ++ SPDK_TEST_REDUCE=1 00:01:07.467 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:07.467 ++ SPDK_RUN_UBSAN=1 00:01:07.467 ++ RUN_NIGHTLY=0 00:01:07.467 + case $SPDK_TEST_NVMF_NICS in 00:01:07.467 + DRIVERS= 00:01:07.467 + [[ -n '' ]] 00:01:07.467 + exit 0 
00:01:07.473 [Pipeline] } 00:01:07.488 [Pipeline] // withEnv 00:01:07.494 [Pipeline] } 00:01:07.507 [Pipeline] // stage 00:01:07.518 [Pipeline] catchError 00:01:07.519 [Pipeline] { 00:01:07.534 [Pipeline] timeout 00:01:07.534 Timeout set to expire in 40 min 00:01:07.536 [Pipeline] { 00:01:07.551 [Pipeline] stage 00:01:07.553 [Pipeline] { (Tests) 00:01:07.570 [Pipeline] sh 00:01:07.848 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:07.849 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:07.849 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:07.849 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:07.849 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:07.849 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:07.849 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:07.849 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:07.849 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:07.849 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:07.849 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:07.849 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:07.849 + source /etc/os-release 00:01:07.849 ++ NAME='Fedora Linux' 00:01:07.849 ++ VERSION='38 (Cloud Edition)' 00:01:07.849 ++ ID=fedora 00:01:07.849 ++ VERSION_ID=38 00:01:07.849 ++ VERSION_CODENAME= 00:01:07.849 ++ PLATFORM_ID=platform:f38 00:01:07.849 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:07.849 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:07.849 ++ LOGO=fedora-logo-icon 00:01:07.849 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:07.849 ++ HOME_URL=https://fedoraproject.org/ 00:01:07.849 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:07.849 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:07.849 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:07.849 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:07.849 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:07.849 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:07.849 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:07.849 ++ SUPPORT_END=2024-05-14 00:01:07.849 ++ VARIANT='Cloud Edition' 00:01:07.849 ++ VARIANT_ID=cloud 00:01:07.849 + uname -a 00:01:07.849 Linux spdk-wfp-51 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:07.849 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:11.126 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:01:11.126 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:01:11.126 Hugepages 00:01:11.126 node hugesize free / total 00:01:11.126 node0 1048576kB 0 / 0 00:01:11.126 node0 2048kB 0 / 0 00:01:11.126 node1 1048576kB 0 / 0 00:01:11.126 node1 2048kB 0 / 0 00:01:11.126 00:01:11.126 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:11.126 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:11.126 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:11.126 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:01:11.126 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 
00:01:11.126 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:11.126 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:11.126 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:11.126 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:11.126 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:11.126 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:11.126 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:11.126 VMD 0000:85:05.5 8086 201d 1 - - - 00:01:11.126 VMD 0000:ae:05.5 8086 201d 1 - - - 00:01:11.126 + rm -f /tmp/spdk-ld-path 00:01:11.126 + source autorun-spdk.conf 00:01:11.126 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:11.126 ++ SPDK_TEST_BLOCKDEV=1 00:01:11.126 ++ SPDK_TEST_ISAL=1 00:01:11.126 ++ SPDK_TEST_CRYPTO=1 00:01:11.126 ++ SPDK_TEST_REDUCE=1 00:01:11.126 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:11.126 ++ SPDK_RUN_UBSAN=1 00:01:11.126 ++ RUN_NIGHTLY=0 00:01:11.126 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:11.126 + [[ -n '' ]] 00:01:11.126 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:11.385 + for M in /var/spdk/build-*-manifest.txt 00:01:11.385 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:11.385 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:11.385 + for M in /var/spdk/build-*-manifest.txt 00:01:11.385 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:11.385 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:11.385 ++ uname 00:01:11.385 + [[ Linux == \L\i\n\u\x ]] 00:01:11.385 + sudo dmesg -T 00:01:11.385 + sudo dmesg --clear 00:01:11.385 + dmesg_pid=4018129 00:01:11.385 + sudo dmesg -Tw 00:01:11.385 + [[ Fedora Linux == FreeBSD ]] 00:01:11.385 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:11.385 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:11.385 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:11.385 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:11.385 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:11.385 + [[ -x /usr/src/fio-static/fio ]] 00:01:11.385 + export FIO_BIN=/usr/src/fio-static/fio 00:01:11.385 + FIO_BIN=/usr/src/fio-static/fio 00:01:11.385 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:11.385 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:11.385 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:11.385 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:11.385 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:11.385 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:11.385 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:11.385 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:11.385 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:11.385 Test configuration: 00:01:11.385 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:11.385 SPDK_TEST_BLOCKDEV=1 00:01:11.385 SPDK_TEST_ISAL=1 00:01:11.385 SPDK_TEST_CRYPTO=1 00:01:11.385 SPDK_TEST_REDUCE=1 00:01:11.385 SPDK_TEST_VBDEV_COMPRESS=1 00:01:11.385 SPDK_RUN_UBSAN=1 00:01:11.385 RUN_NIGHTLY=0 13:23:58 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:11.385 13:23:58 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:11.385 13:23:58 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:11.385 13:23:58 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:11.385 13:23:58 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.385 13:23:58 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.385 13:23:58 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.385 13:23:58 -- paths/export.sh@5 -- $ export PATH 00:01:11.385 13:23:58 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:11.385 13:23:58 -- common/autobuild_common.sh@472 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:11.385 13:23:58 -- common/autobuild_common.sh@473 -- $ date +%s 00:01:11.385 13:23:58 -- common/autobuild_common.sh@473 -- $ mktemp -dt spdk_1721042638.XXXXXX 00:01:11.385 13:23:58 -- common/autobuild_common.sh@473 -- $ SPDK_WORKSPACE=/tmp/spdk_1721042638.riO981 00:01:11.385 13:23:58 -- common/autobuild_common.sh@475 -- $ [[ -n '' ]] 00:01:11.385 13:23:58 -- common/autobuild_common.sh@479 -- $ '[' -n '' ']' 
00:01:11.385 13:23:58 -- common/autobuild_common.sh@482 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:11.385 13:23:58 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:11.385 13:23:58 -- common/autobuild_common.sh@488 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:11.385 13:23:58 -- common/autobuild_common.sh@489 -- $ get_config_params 00:01:11.385 13:23:58 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:11.385 13:23:58 -- common/autotest_common.sh@10 -- $ set +x 00:01:11.644 13:23:59 -- common/autobuild_common.sh@489 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:11.644 13:23:59 -- common/autobuild_common.sh@491 -- $ start_monitor_resources 00:01:11.644 13:23:59 -- pm/common@17 -- $ local monitor 00:01:11.644 13:23:59 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.644 13:23:59 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.644 13:23:59 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.644 13:23:59 -- pm/common@21 -- $ date +%s 00:01:11.644 13:23:59 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:11.644 13:23:59 -- pm/common@21 -- $ date +%s 00:01:11.644 13:23:59 -- pm/common@25 -- $ sleep 1 00:01:11.644 13:23:59 -- pm/common@21 -- $ date +%s 00:01:11.644 13:23:59 -- pm/common@21 -- $ date +%s 00:01:11.644 13:23:59 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042639 00:01:11.644 13:23:59 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042639 00:01:11.644 13:23:59 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042639 00:01:11.644 13:23:59 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042639 00:01:11.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042639_collect-vmstat.pm.log 00:01:11.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042639_collect-cpu-load.pm.log 00:01:11.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042639_collect-cpu-temp.pm.log 00:01:11.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042639_collect-bmc-pm.bmc.pm.log 00:01:12.577 13:24:00 -- common/autobuild_common.sh@492 -- $ trap stop_monitor_resources EXIT 00:01:12.577 13:24:00 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:01:12.577 13:24:00 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:12.577 13:24:00 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:12.577 13:24:00 -- spdk/autobuild.sh@16 -- $ date -u 00:01:12.577 Mon Jul 15 11:24:00 AM UTC 2024 00:01:12.577 13:24:00 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:12.577 v24.09-pre-207-g9cede6267 00:01:12.577 13:24:00 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:12.577 13:24:00 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:12.577 13:24:00 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:12.577 13:24:00 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:12.577 13:24:00 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:12.577 13:24:00 -- common/autotest_common.sh@10 -- $ set +x 00:01:12.577 ************************************ 00:01:12.577 START TEST ubsan 00:01:12.577 ************************************ 00:01:12.577 13:24:00 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:12.577 using ubsan 00:01:12.577 00:01:12.577 real 0m0.000s 00:01:12.577 user 0m0.000s 00:01:12.577 sys 0m0.000s 00:01:12.577 13:24:00 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:12.577 13:24:00 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:12.577 ************************************ 00:01:12.577 END TEST ubsan 00:01:12.577 ************************************ 00:01:12.577 13:24:00 -- common/autotest_common.sh@1142 -- $ return 0 00:01:12.577 13:24:00 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:12.577 13:24:00 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:12.577 13:24:00 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:12.577 13:24:00 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:01:12.577 13:24:00 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:12.577 13:24:00 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:12.577 13:24:00 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:12.577 13:24:00 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:01:12.577 13:24:00 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:01:12.835 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:01:12.835 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:13.092 Using 'verbs' RDMA provider 00:01:26.661 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:41.522 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:41.522 Creating mk/config.mk...done. 00:01:41.522 Creating mk/cc.flags.mk...done. 00:01:41.522 Type 'make' to build. 
00:01:41.522 13:24:28 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:41.522 13:24:28 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:41.522 13:24:28 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:41.522 13:24:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.522 ************************************ 00:01:41.522 START TEST make 00:01:41.522 ************************************ 00:01:41.522 13:24:28 make -- common/autotest_common.sh@1123 -- $ make -j72 00:01:41.522 make[1]: Nothing to be done for 'all'. 00:02:13.615 The Meson build system 00:02:13.615 Version: 1.3.1 00:02:13.616 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:13.616 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:13.616 Build type: native build 00:02:13.616 Program cat found: YES (/usr/bin/cat) 00:02:13.616 Project name: DPDK 00:02:13.616 Project version: 24.03.0 00:02:13.616 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:13.616 C linker for the host machine: cc ld.bfd 2.39-16 00:02:13.616 Host machine cpu family: x86_64 00:02:13.616 Host machine cpu: x86_64 00:02:13.616 Message: ## Building in Developer Mode ## 00:02:13.616 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:13.616 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:13.616 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:13.616 Program python3 found: YES (/usr/bin/python3) 00:02:13.616 Program cat found: YES (/usr/bin/cat) 00:02:13.616 Compiler for C supports arguments -march=native: YES 00:02:13.616 Checking for size of "void *" : 8 00:02:13.616 Checking for size of "void *" : 8 (cached) 00:02:13.616 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:13.616 Library m found: YES 00:02:13.616 Library numa found: YES 00:02:13.616 Has header "numaif.h" : YES 00:02:13.616 Library fdt found: NO 00:02:13.616 Library execinfo found: NO 00:02:13.616 Has header "execinfo.h" : YES 00:02:13.616 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:13.616 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:13.616 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:13.616 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:13.616 Run-time dependency openssl found: YES 3.0.9 00:02:13.616 Run-time dependency libpcap found: YES 1.10.4 00:02:13.616 Has header "pcap.h" with dependency libpcap: YES 00:02:13.616 Compiler for C supports arguments -Wcast-qual: YES 00:02:13.616 Compiler for C supports arguments -Wdeprecated: YES 00:02:13.616 Compiler for C supports arguments -Wformat: YES 00:02:13.616 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:13.616 Compiler for C supports arguments -Wformat-security: NO 00:02:13.616 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:13.616 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:13.616 Compiler for C supports arguments -Wnested-externs: YES 00:02:13.616 Compiler for C supports arguments -Wold-style-definition: YES 00:02:13.616 Compiler for C supports arguments -Wpointer-arith: YES 00:02:13.616 Compiler for C supports arguments -Wsign-compare: YES 00:02:13.616 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:13.616 Compiler for C supports arguments -Wundef: YES 00:02:13.616 Compiler for C 
supports arguments -Wwrite-strings: YES 00:02:13.616 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:13.616 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:13.616 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:13.616 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:13.616 Program objdump found: YES (/usr/bin/objdump) 00:02:13.616 Compiler for C supports arguments -mavx512f: YES 00:02:13.616 Checking if "AVX512 checking" compiles: YES 00:02:13.616 Fetching value of define "__SSE4_2__" : 1 00:02:13.616 Fetching value of define "__AES__" : 1 00:02:13.616 Fetching value of define "__AVX__" : 1 00:02:13.616 Fetching value of define "__AVX2__" : 1 00:02:13.616 Fetching value of define "__AVX512BW__" : 1 00:02:13.616 Fetching value of define "__AVX512CD__" : 1 00:02:13.616 Fetching value of define "__AVX512DQ__" : 1 00:02:13.616 Fetching value of define "__AVX512F__" : 1 00:02:13.616 Fetching value of define "__AVX512VL__" : 1 00:02:13.616 Fetching value of define "__PCLMUL__" : 1 00:02:13.616 Fetching value of define "__RDRND__" : 1 00:02:13.616 Fetching value of define "__RDSEED__" : 1 00:02:13.616 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:13.616 Fetching value of define "__znver1__" : (undefined) 00:02:13.616 Fetching value of define "__znver2__" : (undefined) 00:02:13.616 Fetching value of define "__znver3__" : (undefined) 00:02:13.616 Fetching value of define "__znver4__" : (undefined) 00:02:13.616 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:13.616 Message: lib/log: Defining dependency "log" 00:02:13.616 Message: lib/kvargs: Defining dependency "kvargs" 00:02:13.616 Message: lib/telemetry: Defining dependency "telemetry" 00:02:13.616 Checking for function "getentropy" : NO 00:02:13.616 Message: lib/eal: Defining dependency "eal" 00:02:13.616 Message: lib/ring: Defining dependency "ring" 00:02:13.616 Message: lib/rcu: Defining dependency "rcu" 00:02:13.616 Message: lib/mempool: Defining dependency "mempool" 00:02:13.616 Message: lib/mbuf: Defining dependency "mbuf" 00:02:13.616 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:13.616 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.616 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.616 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.616 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:13.616 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:13.616 Compiler for C supports arguments -mpclmul: YES 00:02:13.616 Compiler for C supports arguments -maes: YES 00:02:13.616 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:13.616 Compiler for C supports arguments -mavx512bw: YES 00:02:13.616 Compiler for C supports arguments -mavx512dq: YES 00:02:13.616 Compiler for C supports arguments -mavx512vl: YES 00:02:13.616 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:13.616 Compiler for C supports arguments -mavx2: YES 00:02:13.616 Compiler for C supports arguments -mavx: YES 00:02:13.616 Message: lib/net: Defining dependency "net" 00:02:13.616 Message: lib/meter: Defining dependency "meter" 00:02:13.616 Message: lib/ethdev: Defining dependency "ethdev" 00:02:13.616 Message: lib/pci: Defining dependency "pci" 00:02:13.616 Message: lib/cmdline: Defining dependency "cmdline" 00:02:13.616 Message: lib/hash: Defining dependency "hash" 00:02:13.616 Message: lib/timer: Defining dependency "timer" 00:02:13.616 Message: 
lib/compressdev: Defining dependency "compressdev" 00:02:13.616 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:13.616 Message: lib/dmadev: Defining dependency "dmadev" 00:02:13.616 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:13.616 Message: lib/power: Defining dependency "power" 00:02:13.616 Message: lib/reorder: Defining dependency "reorder" 00:02:13.616 Message: lib/security: Defining dependency "security" 00:02:13.616 Has header "linux/userfaultfd.h" : YES 00:02:13.616 Has header "linux/vduse.h" : YES 00:02:13.616 Message: lib/vhost: Defining dependency "vhost" 00:02:13.616 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:13.616 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:13.616 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:13.616 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:13.616 Compiler for C supports arguments -std=c11: YES 00:02:13.616 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:13.616 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:13.616 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:13.616 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:13.616 Run-time dependency libmlx5 found: YES 1.24.44.0 00:02:13.616 Run-time dependency libibverbs found: YES 1.14.44.0 00:02:13.616 Library mtcr_ul found: NO 00:02:13.616 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:13.616 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies 
libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:14.552 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:14.553 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:14.553 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:14.553 Configuring mlx5_autoconf.h using configuration 00:02:14.553 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:14.553 Run-time dependency libcrypto found: YES 3.0.9 00:02:14.553 Library IPSec_MB found: YES 00:02:14.553 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:14.553 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:14.553 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:14.553 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:14.553 Library IPSec_MB found: YES 00:02:14.553 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:14.553 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:14.553 Compiler for C supports arguments -std=c11: YES (cached) 00:02:14.553 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:14.553 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:14.553 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:14.553 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:14.553 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:14.553 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:14.553 Library libisal found: NO 00:02:14.553 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:14.553 Compiler for C supports arguments -std=c11: YES (cached) 00:02:14.553 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:14.553 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:14.553 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:14.553 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:14.553 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:14.553 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:14.553 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:14.553 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:14.553 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:14.553 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:14.553 Program doxygen found: YES (/usr/bin/doxygen) 00:02:14.553 Configuring doxy-api-html.conf using configuration 00:02:14.553 Configuring doxy-api-man.conf using configuration 00:02:14.553 Program mandb found: YES (/usr/bin/mandb) 00:02:14.553 Program sphinx-build found: NO 00:02:14.553 Configuring rte_build_config.h using configuration 00:02:14.553 Message: 00:02:14.553 ================= 00:02:14.553 Applications Enabled 00:02:14.553 ================= 00:02:14.553 00:02:14.553 apps: 00:02:14.553 00:02:14.553 00:02:14.553 Message: 00:02:14.553 ================= 00:02:14.553 Libraries Enabled 00:02:14.553 ================= 00:02:14.553 00:02:14.553 libs: 00:02:14.553 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:14.553 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:14.553 cryptodev, dmadev, power, reorder, security, vhost, 00:02:14.553 00:02:14.553 Message: 00:02:14.553 =============== 00:02:14.553 Drivers Enabled 00:02:14.553 =============== 00:02:14.553 00:02:14.553 common: 00:02:14.553 mlx5, qat, 00:02:14.553 bus: 00:02:14.553 auxiliary, pci, vdev, 00:02:14.553 mempool: 00:02:14.553 ring, 00:02:14.553 dma: 00:02:14.553 00:02:14.553 net: 00:02:14.553 00:02:14.553 crypto: 00:02:14.553 ipsec_mb, mlx5, 00:02:14.553 compress: 00:02:14.553 isal, mlx5, 00:02:14.553 vdpa: 00:02:14.553 00:02:14.553 00:02:14.553 Message: 00:02:14.553 ================= 00:02:14.553 Content Skipped 00:02:14.553 ================= 00:02:14.553 00:02:14.553 apps: 00:02:14.553 dumpcap: explicitly disabled via build config 00:02:14.553 graph: explicitly disabled via build config 00:02:14.553 pdump: explicitly disabled via build config 00:02:14.553 proc-info: explicitly disabled via build config 00:02:14.553 test-acl: explicitly disabled via build config 00:02:14.553 test-bbdev: explicitly disabled via build config 00:02:14.553 test-cmdline: explicitly disabled via build config 00:02:14.553 test-compress-perf: explicitly disabled via build config 00:02:14.553 test-crypto-perf: explicitly disabled via build config 00:02:14.553 test-dma-perf: explicitly disabled via build config 00:02:14.553 test-eventdev: explicitly disabled via build config 00:02:14.553 test-fib: explicitly disabled via 
build config 00:02:14.553 test-flow-perf: explicitly disabled via build config 00:02:14.553 test-gpudev: explicitly disabled via build config 00:02:14.553 test-mldev: explicitly disabled via build config 00:02:14.553 test-pipeline: explicitly disabled via build config 00:02:14.553 test-pmd: explicitly disabled via build config 00:02:14.553 test-regex: explicitly disabled via build config 00:02:14.553 test-sad: explicitly disabled via build config 00:02:14.553 test-security-perf: explicitly disabled via build config 00:02:14.553 00:02:14.553 libs: 00:02:14.553 argparse: explicitly disabled via build config 00:02:14.553 metrics: explicitly disabled via build config 00:02:14.553 acl: explicitly disabled via build config 00:02:14.553 bbdev: explicitly disabled via build config 00:02:14.553 bitratestats: explicitly disabled via build config 00:02:14.553 bpf: explicitly disabled via build config 00:02:14.553 cfgfile: explicitly disabled via build config 00:02:14.553 distributor: explicitly disabled via build config 00:02:14.553 efd: explicitly disabled via build config 00:02:14.553 eventdev: explicitly disabled via build config 00:02:14.553 dispatcher: explicitly disabled via build config 00:02:14.553 gpudev: explicitly disabled via build config 00:02:14.553 gro: explicitly disabled via build config 00:02:14.553 gso: explicitly disabled via build config 00:02:14.553 ip_frag: explicitly disabled via build config 00:02:14.553 jobstats: explicitly disabled via build config 00:02:14.553 latencystats: explicitly disabled via build config 00:02:14.554 lpm: explicitly disabled via build config 00:02:14.554 member: explicitly disabled via build config 00:02:14.554 pcapng: explicitly disabled via build config 00:02:14.554 rawdev: explicitly disabled via build config 00:02:14.554 regexdev: explicitly disabled via build config 00:02:14.554 mldev: explicitly disabled via build config 00:02:14.554 rib: explicitly disabled via build config 00:02:14.554 sched: explicitly disabled via build config 00:02:14.554 stack: explicitly disabled via build config 00:02:14.554 ipsec: explicitly disabled via build config 00:02:14.554 pdcp: explicitly disabled via build config 00:02:14.554 fib: explicitly disabled via build config 00:02:14.554 port: explicitly disabled via build config 00:02:14.554 pdump: explicitly disabled via build config 00:02:14.554 table: explicitly disabled via build config 00:02:14.554 pipeline: explicitly disabled via build config 00:02:14.554 graph: explicitly disabled via build config 00:02:14.554 node: explicitly disabled via build config 00:02:14.554 00:02:14.554 drivers: 00:02:14.554 common/cpt: not in enabled drivers build config 00:02:14.554 common/dpaax: not in enabled drivers build config 00:02:14.554 common/iavf: not in enabled drivers build config 00:02:14.554 common/idpf: not in enabled drivers build config 00:02:14.554 common/ionic: not in enabled drivers build config 00:02:14.554 common/mvep: not in enabled drivers build config 00:02:14.554 common/octeontx: not in enabled drivers build config 00:02:14.554 bus/cdx: not in enabled drivers build config 00:02:14.554 bus/dpaa: not in enabled drivers build config 00:02:14.554 bus/fslmc: not in enabled drivers build config 00:02:14.554 bus/ifpga: not in enabled drivers build config 00:02:14.554 bus/platform: not in enabled drivers build config 00:02:14.554 bus/uacce: not in enabled drivers build config 00:02:14.554 bus/vmbus: not in enabled drivers build config 00:02:14.554 common/cnxk: not in enabled drivers build config 00:02:14.554 
common/nfp: not in enabled drivers build config 00:02:14.554 common/nitrox: not in enabled drivers build config 00:02:14.554 common/sfc_efx: not in enabled drivers build config 00:02:14.554 mempool/bucket: not in enabled drivers build config 00:02:14.554 mempool/cnxk: not in enabled drivers build config 00:02:14.554 mempool/dpaa: not in enabled drivers build config 00:02:14.554 mempool/dpaa2: not in enabled drivers build config 00:02:14.554 mempool/octeontx: not in enabled drivers build config 00:02:14.554 mempool/stack: not in enabled drivers build config 00:02:14.554 dma/cnxk: not in enabled drivers build config 00:02:14.554 dma/dpaa: not in enabled drivers build config 00:02:14.554 dma/dpaa2: not in enabled drivers build config 00:02:14.554 dma/hisilicon: not in enabled drivers build config 00:02:14.554 dma/idxd: not in enabled drivers build config 00:02:14.554 dma/ioat: not in enabled drivers build config 00:02:14.554 dma/skeleton: not in enabled drivers build config 00:02:14.554 net/af_packet: not in enabled drivers build config 00:02:14.554 net/af_xdp: not in enabled drivers build config 00:02:14.554 net/ark: not in enabled drivers build config 00:02:14.554 net/atlantic: not in enabled drivers build config 00:02:14.554 net/avp: not in enabled drivers build config 00:02:14.554 net/axgbe: not in enabled drivers build config 00:02:14.554 net/bnx2x: not in enabled drivers build config 00:02:14.554 net/bnxt: not in enabled drivers build config 00:02:14.554 net/bonding: not in enabled drivers build config 00:02:14.554 net/cnxk: not in enabled drivers build config 00:02:14.554 net/cpfl: not in enabled drivers build config 00:02:14.554 net/cxgbe: not in enabled drivers build config 00:02:14.554 net/dpaa: not in enabled drivers build config 00:02:14.554 net/dpaa2: not in enabled drivers build config 00:02:14.554 net/e1000: not in enabled drivers build config 00:02:14.554 net/ena: not in enabled drivers build config 00:02:14.554 net/enetc: not in enabled drivers build config 00:02:14.554 net/enetfec: not in enabled drivers build config 00:02:14.554 net/enic: not in enabled drivers build config 00:02:14.554 net/failsafe: not in enabled drivers build config 00:02:14.554 net/fm10k: not in enabled drivers build config 00:02:14.554 net/gve: not in enabled drivers build config 00:02:14.554 net/hinic: not in enabled drivers build config 00:02:14.554 net/hns3: not in enabled drivers build config 00:02:14.554 net/i40e: not in enabled drivers build config 00:02:14.554 net/iavf: not in enabled drivers build config 00:02:14.554 net/ice: not in enabled drivers build config 00:02:14.554 net/idpf: not in enabled drivers build config 00:02:14.554 net/igc: not in enabled drivers build config 00:02:14.554 net/ionic: not in enabled drivers build config 00:02:14.554 net/ipn3ke: not in enabled drivers build config 00:02:14.554 net/ixgbe: not in enabled drivers build config 00:02:14.554 net/mana: not in enabled drivers build config 00:02:14.554 net/memif: not in enabled drivers build config 00:02:14.554 net/mlx4: not in enabled drivers build config 00:02:14.554 net/mlx5: not in enabled drivers build config 00:02:14.554 net/mvneta: not in enabled drivers build config 00:02:14.554 net/mvpp2: not in enabled drivers build config 00:02:14.554 net/netvsc: not in enabled drivers build config 00:02:14.554 net/nfb: not in enabled drivers build config 00:02:14.554 net/nfp: not in enabled drivers build config 00:02:14.554 net/ngbe: not in enabled drivers build config 00:02:14.554 net/null: not in enabled drivers build config 
00:02:14.554 net/octeontx: not in enabled drivers build config 00:02:14.554 net/octeon_ep: not in enabled drivers build config 00:02:14.554 net/pcap: not in enabled drivers build config 00:02:14.554 net/pfe: not in enabled drivers build config 00:02:14.554 net/qede: not in enabled drivers build config 00:02:14.554 net/ring: not in enabled drivers build config 00:02:14.554 net/sfc: not in enabled drivers build config 00:02:14.554 net/softnic: not in enabled drivers build config 00:02:14.554 net/tap: not in enabled drivers build config 00:02:14.554 net/thunderx: not in enabled drivers build config 00:02:14.554 net/txgbe: not in enabled drivers build config 00:02:14.554 net/vdev_netvsc: not in enabled drivers build config 00:02:14.554 net/vhost: not in enabled drivers build config 00:02:14.554 net/virtio: not in enabled drivers build config 00:02:14.554 net/vmxnet3: not in enabled drivers build config 00:02:14.554 raw/*: missing internal dependency, "rawdev" 00:02:14.554 crypto/armv8: not in enabled drivers build config 00:02:14.554 crypto/bcmfs: not in enabled drivers build config 00:02:14.554 crypto/caam_jr: not in enabled drivers build config 00:02:14.554 crypto/ccp: not in enabled drivers build config 00:02:14.554 crypto/cnxk: not in enabled drivers build config 00:02:14.554 crypto/dpaa_sec: not in enabled drivers build config 00:02:14.554 crypto/dpaa2_sec: not in enabled drivers build config 00:02:14.554 crypto/mvsam: not in enabled drivers build config 00:02:14.554 crypto/nitrox: not in enabled drivers build config 00:02:14.554 crypto/null: not in enabled drivers build config 00:02:14.554 crypto/octeontx: not in enabled drivers build config 00:02:14.554 crypto/openssl: not in enabled drivers build config 00:02:14.554 crypto/scheduler: not in enabled drivers build config 00:02:14.554 crypto/uadk: not in enabled drivers build config 00:02:14.554 crypto/virtio: not in enabled drivers build config 00:02:14.554 compress/nitrox: not in enabled drivers build config 00:02:14.554 compress/octeontx: not in enabled drivers build config 00:02:14.554 compress/zlib: not in enabled drivers build config 00:02:14.554 regex/*: missing internal dependency, "regexdev" 00:02:14.554 ml/*: missing internal dependency, "mldev" 00:02:14.554 vdpa/ifc: not in enabled drivers build config 00:02:14.554 vdpa/mlx5: not in enabled drivers build config 00:02:14.554 vdpa/nfp: not in enabled drivers build config 00:02:14.554 vdpa/sfc: not in enabled drivers build config 00:02:14.554 event/*: missing internal dependency, "eventdev" 00:02:14.554 baseband/*: missing internal dependency, "bbdev" 00:02:14.554 gpu/*: missing internal dependency, "gpudev" 00:02:14.554 00:02:14.554 00:02:15.171 Build targets in project: 115 00:02:15.171 00:02:15.171 DPDK 24.03.0 00:02:15.171 00:02:15.171 User defined options 00:02:15.171 buildtype : debug 00:02:15.171 default_library : shared 00:02:15.171 libdir : lib 00:02:15.171 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:15.171 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:15.171 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:15.171 cpu_instruction_set: native 00:02:15.171 disable_apps : 
test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:15.171 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:15.171 enable_docs : false 00:02:15.171 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:15.171 enable_kmods : false 00:02:15.171 max_lcores : 128 00:02:15.171 tests : false 00:02:15.171 00:02:15.171 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:15.511 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:15.511 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:15.511 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:15.511 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:15.511 [4/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:15.511 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:15.511 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:15.511 [7/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:15.511 [8/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:15.511 [9/378] Linking static target lib/librte_kvargs.a 00:02:15.511 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:15.511 [11/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:15.511 [12/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:15.511 [13/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:15.511 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:15.511 [15/378] Linking static target lib/librte_log.a 00:02:15.511 [16/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:15.511 [17/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:15.511 [18/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:15.511 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:16.082 [20/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:16.082 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:16.082 [22/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.082 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:16.082 [24/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:16.082 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:16.082 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:16.082 [27/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:16.082 [28/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:16.082 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:16.082 [30/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:16.082 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:16.082 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:16.082 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:16.082 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:16.082 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:16.082 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:16.082 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:16.082 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:16.082 [39/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:16.082 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:16.082 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:16.082 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:16.082 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:16.082 [44/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:16.082 [45/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:16.082 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:16.082 [47/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:16.082 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:16.082 [49/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:16.082 [50/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:16.082 [51/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:16.082 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:16.082 [53/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:16.082 [54/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:16.082 [55/378] Linking static target lib/librte_telemetry.a 00:02:16.082 [56/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:16.082 [57/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:16.082 [58/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:16.082 [59/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:16.082 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:16.082 [61/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:16.082 [62/378] Linking static target lib/librte_ring.a 00:02:16.082 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:16.082 [64/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:16.082 [65/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:16.082 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:16.082 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:16.082 [68/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:16.082 [69/378] 
Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:16.082 [70/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:16.082 [71/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:16.082 [72/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:16.082 [73/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:16.082 [74/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:16.082 [75/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:16.082 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:16.082 [77/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:16.082 [78/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:16.082 [79/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:16.082 [80/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:16.082 [81/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:16.082 [82/378] Linking static target lib/librte_pci.a 00:02:16.082 [83/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:16.082 [84/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:16.082 [85/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:16.082 [86/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:16.082 [87/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:16.082 [88/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:16.082 [89/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:16.082 [90/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:16.082 [91/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:16.082 [92/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:16.082 [93/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:16.082 [94/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:16.082 [95/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:16.082 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:16.082 [97/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:16.343 [98/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:16.343 [99/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:16.343 [100/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:16.343 [101/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:16.343 [102/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:16.343 [103/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:16.343 [104/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:16.343 [105/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:16.343 [106/378] Linking static target lib/librte_mempool.a 00:02:16.343 [107/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:16.343 [108/378] Linking static target lib/librte_rcu.a 00:02:16.343 [109/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:16.343 [110/378] Compiling C object 
lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:16.343 [111/378] Linking static target lib/librte_eal.a 00:02:16.343 [112/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:16.343 [113/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:16.343 [114/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.343 [115/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:16.343 [116/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:16.343 [117/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:16.343 [118/378] Linking static target lib/librte_net.a 00:02:16.343 [119/378] Linking target lib/librte_log.so.24.1 00:02:16.343 [120/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:16.343 [121/378] Linking static target lib/librte_meter.a 00:02:16.605 [122/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.605 [123/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.605 [124/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:16.605 [125/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:16.605 [126/378] Linking static target lib/librte_mbuf.a 00:02:16.605 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:16.606 [128/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:16.606 [129/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:16.606 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:16.606 [131/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:16.606 [132/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:16.606 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:16.606 [134/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:16.606 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:16.606 [136/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:16.606 [137/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:16.606 [138/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:16.606 [139/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:16.606 [140/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:16.606 [141/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:16.606 [142/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:16.606 [143/378] Linking static target lib/librte_cmdline.a 00:02:16.606 [144/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:16.606 [145/378] Linking target lib/librte_kvargs.so.24.1 00:02:16.606 [146/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:16.606 [147/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:16.606 [148/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:16.606 [149/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:16.606 [150/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 
00:02:16.606 [151/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.606 [152/378] Linking static target lib/librte_timer.a 00:02:16.606 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:16.606 [154/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:16.606 [155/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:16.606 [156/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:16.868 [157/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:16.868 [158/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:16.868 [159/378] Linking static target lib/librte_dmadev.a 00:02:16.868 [160/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:16.868 [161/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:16.868 [162/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.868 [163/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:16.868 [164/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:16.868 [165/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:16.868 [166/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:16.868 [167/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:16.868 [168/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:16.868 [169/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:16.868 [170/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:16.868 [171/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.868 [172/378] Linking static target lib/librte_compressdev.a 00:02:16.868 [173/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:16.868 [174/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.868 [175/378] Linking static target lib/librte_power.a 00:02:16.868 [176/378] Linking target lib/librte_telemetry.so.24.1 00:02:16.868 [177/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:16.868 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:16.868 [179/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:16.868 [180/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:16.868 [181/378] Linking static target lib/librte_reorder.a 00:02:16.868 [182/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:16.868 [183/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:16.868 [184/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:16.868 [185/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:16.868 [186/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:16.868 [187/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:16.868 [188/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:16.868 [189/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:16.868 [190/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:16.868 [191/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:16.868 [192/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:16.868 [193/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:16.868 [194/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:16.868 [195/378] Linking static target lib/librte_security.a 00:02:17.132 [196/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:17.132 [197/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:17.132 [198/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:17.132 [199/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:17.132 [200/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:17.132 [201/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:17.132 [202/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:17.132 [203/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:17.132 [204/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:17.132 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:17.132 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:17.132 [207/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:17.132 [208/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:17.132 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:17.132 [210/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:17.132 [211/378] Linking static target drivers/librte_bus_vdev.a 00:02:17.132 [212/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:17.132 [213/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:17.132 [214/378] Linking static target lib/librte_hash.a 00:02:17.132 [215/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [216/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:17.392 [218/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:17.392 [219/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:17.392 [220/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:17.392 [221/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:17.392 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:17.392 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:17.392 [224/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:17.392 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:17.392 [226/378] Linking static target 
drivers/librte_bus_pci.a 00:02:17.392 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:17.392 [228/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:17.392 [229/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:17.392 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:17.392 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:17.392 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:17.392 [233/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:17.392 [234/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:17.392 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:17.392 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:17.392 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:17.392 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:17.392 [239/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [240/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:17.392 [241/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [242/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:17.392 [243/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:17.392 [244/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:17.392 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:17.392 [246/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:17.392 [247/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:17.392 [248/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [249/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [250/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:17.392 [251/378] Linking static target lib/librte_cryptodev.a 00:02:17.392 [252/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:17.392 [253/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:17.392 [254/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [255/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.392 [256/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:17.651 [257/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:17.651 [258/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:17.651 [259/378] Generating 
drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.651 [260/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:17.651 [261/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:17.651 [262/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:17.651 [263/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:17.651 [264/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.651 [265/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:17.651 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:17.651 [267/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:17.651 [268/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:17.651 [269/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:17.651 [270/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:17.651 [271/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:17.651 [272/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:17.651 [273/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:17.651 [274/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:17.651 [275/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:17.908 [276/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.908 [277/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:17.908 [278/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.908 [279/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:17.908 [280/378] Linking static target drivers/librte_mempool_ring.a 00:02:17.908 [281/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:17.908 [282/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:17.908 [283/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:17.908 [284/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:17.908 [285/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:17.908 [286/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.908 [287/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:17.908 [288/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:17.908 [289/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:17.908 [290/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:17.908 [291/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:17.908 [292/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:17.908 
[293/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:17.908 [294/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:17.908 [295/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:17.908 [296/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:17.908 [297/378] Linking static target lib/librte_ethdev.a 00:02:17.908 [298/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:17.908 [299/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:17.908 [300/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.908 [301/378] Linking static target drivers/librte_compress_isal.a 00:02:18.166 [302/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:18.166 [303/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:18.166 [304/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:18.166 [305/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:18.166 [306/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:18.166 [307/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:18.166 [308/378] Linking static target drivers/librte_common_mlx5.a 00:02:18.166 [309/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.166 [310/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:18.166 [311/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:18.166 [312/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:18.166 [313/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:18.166 [314/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:18.166 [315/378] Linking static target drivers/librte_compress_mlx5.a 00:02:18.166 [316/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:18.166 [317/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:18.423 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:18.423 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:18.681 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:18.681 [321/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:18.681 [322/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:18.940 [323/378] Linking static target drivers/librte_common_qat.a 00:02:18.940 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:19.198 [325/378] Linking static target lib/librte_vhost.a 00:02:19.763 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.137 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.415 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture 
output) 00:02:27.695 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.628 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.628 [331/378] Linking target lib/librte_eal.so.24.1 00:02:28.628 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:28.628 [333/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:28.628 [334/378] Linking target lib/librte_meter.so.24.1 00:02:28.628 [335/378] Linking target lib/librte_ring.so.24.1 00:02:28.628 [336/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:28.628 [337/378] Linking target lib/librte_timer.so.24.1 00:02:28.628 [338/378] Linking target lib/librte_pci.so.24.1 00:02:28.628 [339/378] Linking target lib/librte_dmadev.so.24.1 00:02:28.885 [340/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:28.885 [341/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:28.885 [342/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:28.885 [343/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:28.885 [344/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:28.885 [345/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:28.885 [346/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:28.885 [347/378] Linking target lib/librte_rcu.so.24.1 00:02:28.885 [348/378] Linking target lib/librte_mempool.so.24.1 00:02:28.885 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:28.885 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:28.885 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:28.885 [352/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:29.143 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:29.143 [354/378] Linking target lib/librte_mbuf.so.24.1 00:02:29.143 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:29.143 [356/378] Linking target lib/librte_net.so.24.1 00:02:29.143 [357/378] Linking target lib/librte_cryptodev.so.24.1 00:02:29.143 [358/378] Linking target lib/librte_reorder.so.24.1 00:02:29.143 [359/378] Linking target lib/librte_compressdev.so.24.1 00:02:29.401 [360/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:29.401 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:29.401 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:29.401 [363/378] Linking target lib/librte_hash.so.24.1 00:02:29.401 [364/378] Linking target lib/librte_cmdline.so.24.1 00:02:29.401 [365/378] Linking target lib/librte_security.so.24.1 00:02:29.401 [366/378] Linking target lib/librte_ethdev.so.24.1 00:02:29.401 [367/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:29.401 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:29.401 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:29.660 [370/378] Generating symbol file 
lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:29.660 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:29.660 [372/378] Linking target lib/librte_power.so.24.1 00:02:29.660 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:29.660 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:29.660 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:29.660 [376/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:29.660 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:29.660 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:02:29.917 INFO: autodetecting backend as ninja 00:02:29.917 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:30.850 CC lib/log/log.o 00:02:30.850 CC lib/log/log_deprecated.o 00:02:30.850 CC lib/log/log_flags.o 00:02:30.850 CC lib/ut_mock/mock.o 00:02:30.850 CC lib/ut/ut.o 00:02:31.106 LIB libspdk_log.a 00:02:31.106 LIB libspdk_ut_mock.a 00:02:31.106 LIB libspdk_ut.a 00:02:31.106 SO libspdk_log.so.7.0 00:02:31.106 SO libspdk_ut_mock.so.6.0 00:02:31.106 SO libspdk_ut.so.2.0 00:02:31.106 SYMLINK libspdk_log.so 00:02:31.106 SYMLINK libspdk_ut.so 00:02:31.106 SYMLINK libspdk_ut_mock.so 00:02:31.362 CC lib/dma/dma.o 00:02:31.363 CC lib/ioat/ioat.o 00:02:31.363 CXX lib/trace_parser/trace.o 00:02:31.363 CC lib/util/base64.o 00:02:31.363 CC lib/util/bit_array.o 00:02:31.363 CC lib/util/cpuset.o 00:02:31.363 CC lib/util/crc32c.o 00:02:31.363 CC lib/util/crc16.o 00:02:31.363 CC lib/util/crc32.o 00:02:31.363 CC lib/util/crc32_ieee.o 00:02:31.363 CC lib/util/crc64.o 00:02:31.363 CC lib/util/dif.o 00:02:31.363 CC lib/util/fd.o 00:02:31.619 CC lib/util/file.o 00:02:31.619 CC lib/util/hexlify.o 00:02:31.619 CC lib/util/iov.o 00:02:31.619 CC lib/util/pipe.o 00:02:31.619 CC lib/util/math.o 00:02:31.619 CC lib/util/strerror_tls.o 00:02:31.619 CC lib/util/string.o 00:02:31.620 CC lib/util/uuid.o 00:02:31.620 CC lib/util/xor.o 00:02:31.620 CC lib/util/fd_group.o 00:02:31.620 CC lib/util/zipf.o 00:02:31.620 CC lib/vfio_user/host/vfio_user_pci.o 00:02:31.620 CC lib/vfio_user/host/vfio_user.o 00:02:31.620 LIB libspdk_dma.a 00:02:31.620 SO libspdk_dma.so.4.0 00:02:31.620 LIB libspdk_ioat.a 00:02:31.875 SYMLINK libspdk_dma.so 00:02:31.876 SO libspdk_ioat.so.7.0 00:02:31.876 SYMLINK libspdk_ioat.so 00:02:31.876 LIB libspdk_vfio_user.a 00:02:31.876 SO libspdk_vfio_user.so.5.0 00:02:31.876 LIB libspdk_util.a 00:02:31.876 SYMLINK libspdk_vfio_user.so 00:02:32.131 SO libspdk_util.so.9.1 00:02:32.131 SYMLINK libspdk_util.so 00:02:32.131 LIB libspdk_trace_parser.a 00:02:32.131 SO libspdk_trace_parser.so.5.0 00:02:32.388 SYMLINK libspdk_trace_parser.so 00:02:32.388 CC lib/reduce/reduce.o 00:02:32.388 CC lib/rdma_provider/common.o 00:02:32.388 CC lib/idxd/idxd_user.o 00:02:32.388 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:32.388 CC lib/idxd/idxd.o 00:02:32.388 CC lib/idxd/idxd_kernel.o 00:02:32.388 CC lib/rdma_utils/rdma_utils.o 00:02:32.388 CC lib/conf/conf.o 00:02:32.388 CC lib/env_dpdk/env.o 00:02:32.388 CC lib/env_dpdk/memory.o 00:02:32.388 CC lib/env_dpdk/init.o 00:02:32.388 CC lib/vmd/led.o 00:02:32.388 CC lib/env_dpdk/pci.o 00:02:32.388 CC lib/vmd/vmd.o 00:02:32.388 CC lib/env_dpdk/threads.o 00:02:32.646 CC lib/env_dpdk/pci_virtio.o 00:02:32.646 CC lib/env_dpdk/pci_ioat.o 00:02:32.646 CC lib/env_dpdk/pci_vmd.o 00:02:32.646 CC lib/env_dpdk/pci_event.o 
00:02:32.646 CC lib/env_dpdk/pci_idxd.o 00:02:32.646 CC lib/env_dpdk/sigbus_handler.o 00:02:32.646 CC lib/env_dpdk/pci_dpdk.o 00:02:32.646 CC lib/json/json_parse.o 00:02:32.646 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:32.646 CC lib/json/json_util.o 00:02:32.646 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:32.646 CC lib/json/json_write.o 00:02:32.646 LIB libspdk_rdma_provider.a 00:02:32.646 LIB libspdk_conf.a 00:02:32.646 SO libspdk_rdma_provider.so.6.0 00:02:32.646 SO libspdk_conf.so.6.0 00:02:32.903 LIB libspdk_rdma_utils.a 00:02:32.903 LIB libspdk_json.a 00:02:32.903 SYMLINK libspdk_rdma_provider.so 00:02:32.903 SO libspdk_rdma_utils.so.1.0 00:02:32.903 SYMLINK libspdk_conf.so 00:02:32.903 SO libspdk_json.so.6.0 00:02:32.903 SYMLINK libspdk_rdma_utils.so 00:02:32.903 SYMLINK libspdk_json.so 00:02:32.903 LIB libspdk_idxd.a 00:02:32.903 SO libspdk_idxd.so.12.0 00:02:32.903 LIB libspdk_reduce.a 00:02:33.159 LIB libspdk_vmd.a 00:02:33.159 SYMLINK libspdk_idxd.so 00:02:33.159 SO libspdk_reduce.so.6.0 00:02:33.159 SO libspdk_vmd.so.6.0 00:02:33.159 SYMLINK libspdk_reduce.so 00:02:33.159 SYMLINK libspdk_vmd.so 00:02:33.159 CC lib/jsonrpc/jsonrpc_server.o 00:02:33.159 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:33.159 CC lib/jsonrpc/jsonrpc_client.o 00:02:33.159 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:33.416 LIB libspdk_jsonrpc.a 00:02:33.416 SO libspdk_jsonrpc.so.6.0 00:02:33.672 SYMLINK libspdk_jsonrpc.so 00:02:33.672 LIB libspdk_env_dpdk.a 00:02:33.672 SO libspdk_env_dpdk.so.14.1 00:02:33.672 SYMLINK libspdk_env_dpdk.so 00:02:33.929 CC lib/rpc/rpc.o 00:02:34.186 LIB libspdk_rpc.a 00:02:34.186 SO libspdk_rpc.so.6.0 00:02:34.186 SYMLINK libspdk_rpc.so 00:02:34.750 CC lib/trace/trace.o 00:02:34.750 CC lib/trace/trace_flags.o 00:02:34.750 CC lib/keyring/keyring.o 00:02:34.750 CC lib/trace/trace_rpc.o 00:02:34.750 CC lib/keyring/keyring_rpc.o 00:02:34.750 CC lib/notify/notify.o 00:02:34.750 CC lib/notify/notify_rpc.o 00:02:34.750 LIB libspdk_notify.a 00:02:34.750 SO libspdk_notify.so.6.0 00:02:34.750 LIB libspdk_keyring.a 00:02:34.750 LIB libspdk_trace.a 00:02:34.750 SO libspdk_keyring.so.1.0 00:02:34.750 SYMLINK libspdk_notify.so 00:02:34.750 SO libspdk_trace.so.10.0 00:02:35.008 SYMLINK libspdk_keyring.so 00:02:35.008 SYMLINK libspdk_trace.so 00:02:35.265 CC lib/sock/sock_rpc.o 00:02:35.265 CC lib/sock/sock.o 00:02:35.265 CC lib/thread/iobuf.o 00:02:35.265 CC lib/thread/thread.o 00:02:35.522 LIB libspdk_sock.a 00:02:35.522 SO libspdk_sock.so.10.0 00:02:35.522 SYMLINK libspdk_sock.so 00:02:36.086 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:36.086 CC lib/nvme/nvme_fabric.o 00:02:36.086 CC lib/nvme/nvme_ns_cmd.o 00:02:36.086 CC lib/nvme/nvme_ctrlr.o 00:02:36.086 CC lib/nvme/nvme_ns.o 00:02:36.086 CC lib/nvme/nvme_pcie_common.o 00:02:36.086 CC lib/nvme/nvme_pcie.o 00:02:36.086 CC lib/nvme/nvme.o 00:02:36.086 CC lib/nvme/nvme_qpair.o 00:02:36.086 CC lib/nvme/nvme_quirks.o 00:02:36.086 CC lib/nvme/nvme_transport.o 00:02:36.086 CC lib/nvme/nvme_discovery.o 00:02:36.086 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:36.086 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:36.086 CC lib/nvme/nvme_tcp.o 00:02:36.086 CC lib/nvme/nvme_opal.o 00:02:36.086 CC lib/nvme/nvme_io_msg.o 00:02:36.086 CC lib/nvme/nvme_poll_group.o 00:02:36.086 CC lib/nvme/nvme_zns.o 00:02:36.086 CC lib/nvme/nvme_stubs.o 00:02:36.086 CC lib/nvme/nvme_auth.o 00:02:36.086 CC lib/nvme/nvme_cuse.o 00:02:36.086 CC lib/nvme/nvme_rdma.o 00:02:36.343 LIB libspdk_thread.a 00:02:36.343 SO libspdk_thread.so.10.1 00:02:36.601 SYMLINK libspdk_thread.so 00:02:36.858 CC 
lib/init/json_config.o 00:02:36.858 CC lib/init/subsystem.o 00:02:36.858 CC lib/init/subsystem_rpc.o 00:02:36.858 CC lib/init/rpc.o 00:02:36.858 CC lib/virtio/virtio_vhost_user.o 00:02:36.858 CC lib/virtio/virtio.o 00:02:36.858 CC lib/virtio/virtio_vfio_user.o 00:02:36.858 CC lib/virtio/virtio_pci.o 00:02:36.858 CC lib/blob/zeroes.o 00:02:36.858 CC lib/blob/blobstore.o 00:02:36.858 CC lib/blob/request.o 00:02:36.858 CC lib/blob/blob_bs_dev.o 00:02:36.858 CC lib/accel/accel.o 00:02:36.858 CC lib/accel/accel_sw.o 00:02:36.858 CC lib/accel/accel_rpc.o 00:02:37.115 LIB libspdk_init.a 00:02:37.115 SO libspdk_init.so.5.0 00:02:37.115 LIB libspdk_virtio.a 00:02:37.115 SYMLINK libspdk_init.so 00:02:37.115 SO libspdk_virtio.so.7.0 00:02:37.115 SYMLINK libspdk_virtio.so 00:02:37.371 CC lib/event/reactor.o 00:02:37.371 CC lib/event/app.o 00:02:37.371 CC lib/event/app_rpc.o 00:02:37.371 CC lib/event/log_rpc.o 00:02:37.371 CC lib/event/scheduler_static.o 00:02:37.628 LIB libspdk_accel.a 00:02:37.628 SO libspdk_accel.so.15.1 00:02:37.628 LIB libspdk_nvme.a 00:02:37.628 SYMLINK libspdk_accel.so 00:02:37.886 SO libspdk_nvme.so.13.1 00:02:37.886 LIB libspdk_event.a 00:02:37.886 SO libspdk_event.so.14.0 00:02:37.886 SYMLINK libspdk_event.so 00:02:38.143 CC lib/bdev/bdev.o 00:02:38.143 CC lib/bdev/bdev_rpc.o 00:02:38.143 CC lib/bdev/part.o 00:02:38.143 CC lib/bdev/scsi_nvme.o 00:02:38.143 CC lib/bdev/bdev_zone.o 00:02:38.143 SYMLINK libspdk_nvme.so 00:02:39.074 LIB libspdk_blob.a 00:02:39.074 SO libspdk_blob.so.11.0 00:02:39.074 SYMLINK libspdk_blob.so 00:02:39.331 CC lib/blobfs/blobfs.o 00:02:39.331 CC lib/blobfs/tree.o 00:02:39.331 CC lib/lvol/lvol.o 00:02:39.897 LIB libspdk_bdev.a 00:02:39.897 SO libspdk_bdev.so.15.1 00:02:39.897 LIB libspdk_blobfs.a 00:02:39.897 SYMLINK libspdk_bdev.so 00:02:40.156 SO libspdk_blobfs.so.10.0 00:02:40.156 LIB libspdk_lvol.a 00:02:40.156 SYMLINK libspdk_blobfs.so 00:02:40.156 SO libspdk_lvol.so.10.0 00:02:40.156 SYMLINK libspdk_lvol.so 00:02:40.426 CC lib/ublk/ublk.o 00:02:40.426 CC lib/ublk/ublk_rpc.o 00:02:40.426 CC lib/nvmf/ctrlr.o 00:02:40.426 CC lib/nvmf/ctrlr_discovery.o 00:02:40.426 CC lib/nvmf/subsystem.o 00:02:40.426 CC lib/nvmf/ctrlr_bdev.o 00:02:40.426 CC lib/nvmf/nvmf.o 00:02:40.426 CC lib/nvmf/transport.o 00:02:40.426 CC lib/nvmf/nvmf_rpc.o 00:02:40.426 CC lib/nvmf/tcp.o 00:02:40.426 CC lib/scsi/lun.o 00:02:40.426 CC lib/scsi/dev.o 00:02:40.426 CC lib/scsi/scsi.o 00:02:40.426 CC lib/nvmf/stubs.o 00:02:40.426 CC lib/scsi/port.o 00:02:40.426 CC lib/nvmf/mdns_server.o 00:02:40.426 CC lib/nvmf/rdma.o 00:02:40.426 CC lib/scsi/scsi_bdev.o 00:02:40.426 CC lib/scsi/scsi_pr.o 00:02:40.426 CC lib/nvmf/auth.o 00:02:40.426 CC lib/nbd/nbd.o 00:02:40.426 CC lib/scsi/scsi_rpc.o 00:02:40.426 CC lib/nbd/nbd_rpc.o 00:02:40.426 CC lib/scsi/task.o 00:02:40.426 CC lib/ftl/ftl_core.o 00:02:40.426 CC lib/ftl/ftl_init.o 00:02:40.426 CC lib/ftl/ftl_layout.o 00:02:40.426 CC lib/ftl/ftl_debug.o 00:02:40.426 CC lib/ftl/ftl_io.o 00:02:40.426 CC lib/ftl/ftl_sb.o 00:02:40.426 CC lib/ftl/ftl_l2p.o 00:02:40.426 CC lib/ftl/ftl_l2p_flat.o 00:02:40.426 CC lib/ftl/ftl_nv_cache.o 00:02:40.426 CC lib/ftl/ftl_band.o 00:02:40.426 CC lib/ftl/ftl_band_ops.o 00:02:40.426 CC lib/ftl/ftl_rq.o 00:02:40.426 CC lib/ftl/ftl_writer.o 00:02:40.426 CC lib/ftl/ftl_reloc.o 00:02:40.426 CC lib/ftl/ftl_l2p_cache.o 00:02:40.426 CC lib/ftl/ftl_p2l.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:40.426 CC 
lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:40.426 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:40.426 CC lib/ftl/utils/ftl_conf.o 00:02:40.426 CC lib/ftl/utils/ftl_md.o 00:02:40.426 CC lib/ftl/utils/ftl_mempool.o 00:02:40.426 CC lib/ftl/utils/ftl_bitmap.o 00:02:40.426 CC lib/ftl/utils/ftl_property.o 00:02:40.426 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:40.426 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:40.426 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:40.426 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:40.426 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:40.426 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:40.426 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:40.426 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:40.426 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:40.426 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:40.426 CC lib/ftl/base/ftl_base_dev.o 00:02:40.426 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:40.426 CC lib/ftl/base/ftl_base_bdev.o 00:02:40.426 CC lib/ftl/ftl_trace.o 00:02:40.993 LIB libspdk_nbd.a 00:02:40.993 SO libspdk_nbd.so.7.0 00:02:40.993 LIB libspdk_scsi.a 00:02:40.993 SYMLINK libspdk_nbd.so 00:02:40.993 SO libspdk_scsi.so.9.0 00:02:40.993 LIB libspdk_ublk.a 00:02:40.993 SO libspdk_ublk.so.3.0 00:02:40.993 SYMLINK libspdk_scsi.so 00:02:41.252 SYMLINK libspdk_ublk.so 00:02:41.252 LIB libspdk_ftl.a 00:02:41.510 CC lib/vhost/vhost.o 00:02:41.510 CC lib/vhost/vhost_rpc.o 00:02:41.510 CC lib/vhost/vhost_scsi.o 00:02:41.510 CC lib/vhost/vhost_blk.o 00:02:41.510 CC lib/iscsi/conn.o 00:02:41.510 CC lib/vhost/rte_vhost_user.o 00:02:41.510 CC lib/iscsi/init_grp.o 00:02:41.510 CC lib/iscsi/param.o 00:02:41.510 CC lib/iscsi/iscsi.o 00:02:41.510 CC lib/iscsi/md5.o 00:02:41.510 CC lib/iscsi/portal_grp.o 00:02:41.510 CC lib/iscsi/iscsi_rpc.o 00:02:41.510 CC lib/iscsi/tgt_node.o 00:02:41.510 CC lib/iscsi/iscsi_subsystem.o 00:02:41.510 CC lib/iscsi/task.o 00:02:41.510 SO libspdk_ftl.so.9.0 00:02:41.784 SYMLINK libspdk_ftl.so 00:02:42.087 LIB libspdk_nvmf.a 00:02:42.357 SO libspdk_nvmf.so.18.1 00:02:42.357 LIB libspdk_vhost.a 00:02:42.357 SO libspdk_vhost.so.8.0 00:02:42.357 SYMLINK libspdk_nvmf.so 00:02:42.357 SYMLINK libspdk_vhost.so 00:02:42.357 LIB libspdk_iscsi.a 00:02:42.616 SO libspdk_iscsi.so.8.0 00:02:42.616 SYMLINK libspdk_iscsi.so 00:02:43.184 CC module/env_dpdk/env_dpdk_rpc.o 00:02:43.184 CC module/accel/error/accel_error.o 00:02:43.184 CC module/accel/error/accel_error_rpc.o 00:02:43.184 CC module/blob/bdev/blob_bdev.o 00:02:43.184 LIB libspdk_env_dpdk_rpc.a 00:02:43.184 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:43.184 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:43.184 CC module/sock/posix/posix.o 00:02:43.184 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:43.443 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:43.443 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:43.443 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:43.443 CC module/keyring/file/keyring.o 00:02:43.443 CC module/keyring/linux/keyring.o 00:02:43.443 CC module/keyring/linux/keyring_rpc.o 00:02:43.443 CC module/keyring/file/keyring_rpc.o 00:02:43.443 CC module/accel/iaa/accel_iaa_rpc.o 00:02:43.443 CC 
module/accel/iaa/accel_iaa.o 00:02:43.443 CC module/accel/dsa/accel_dsa.o 00:02:43.443 CC module/accel/dsa/accel_dsa_rpc.o 00:02:43.443 CC module/scheduler/gscheduler/gscheduler.o 00:02:43.443 CC module/accel/ioat/accel_ioat.o 00:02:43.443 CC module/accel/ioat/accel_ioat_rpc.o 00:02:43.443 SO libspdk_env_dpdk_rpc.so.6.0 00:02:43.443 SYMLINK libspdk_env_dpdk_rpc.so 00:02:43.443 LIB libspdk_accel_error.a 00:02:43.443 LIB libspdk_keyring_linux.a 00:02:43.443 LIB libspdk_keyring_file.a 00:02:43.443 LIB libspdk_scheduler_gscheduler.a 00:02:43.443 SO libspdk_accel_error.so.2.0 00:02:43.443 LIB libspdk_scheduler_dpdk_governor.a 00:02:43.443 LIB libspdk_scheduler_dynamic.a 00:02:43.443 SO libspdk_keyring_linux.so.1.0 00:02:43.443 LIB libspdk_accel_iaa.a 00:02:43.443 SO libspdk_scheduler_gscheduler.so.4.0 00:02:43.443 LIB libspdk_blob_bdev.a 00:02:43.443 SO libspdk_keyring_file.so.1.0 00:02:43.443 SO libspdk_scheduler_dynamic.so.4.0 00:02:43.443 LIB libspdk_accel_ioat.a 00:02:43.443 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:43.443 SYMLINK libspdk_accel_error.so 00:02:43.443 SO libspdk_accel_ioat.so.6.0 00:02:43.702 SO libspdk_accel_iaa.so.3.0 00:02:43.702 SO libspdk_blob_bdev.so.11.0 00:02:43.702 SYMLINK libspdk_keyring_linux.so 00:02:43.702 SYMLINK libspdk_scheduler_dynamic.so 00:02:43.702 LIB libspdk_accel_dsa.a 00:02:43.702 SYMLINK libspdk_keyring_file.so 00:02:43.702 SYMLINK libspdk_scheduler_gscheduler.so 00:02:43.702 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:43.702 SO libspdk_accel_dsa.so.5.0 00:02:43.702 SYMLINK libspdk_accel_iaa.so 00:02:43.702 SYMLINK libspdk_blob_bdev.so 00:02:43.702 SYMLINK libspdk_accel_ioat.so 00:02:43.702 SYMLINK libspdk_accel_dsa.so 00:02:43.960 LIB libspdk_sock_posix.a 00:02:43.960 SO libspdk_sock_posix.so.6.0 00:02:43.960 SYMLINK libspdk_sock_posix.so 00:02:43.960 CC module/bdev/gpt/gpt.o 00:02:43.960 CC module/bdev/gpt/vbdev_gpt.o 00:02:43.960 CC module/bdev/lvol/vbdev_lvol.o 00:02:43.960 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:43.960 CC module/bdev/split/vbdev_split.o 00:02:43.960 CC module/bdev/error/vbdev_error_rpc.o 00:02:43.960 CC module/bdev/split/vbdev_split_rpc.o 00:02:43.960 CC module/bdev/error/vbdev_error.o 00:02:43.960 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:43.960 CC module/bdev/malloc/bdev_malloc.o 00:02:43.960 CC module/bdev/delay/vbdev_delay.o 00:02:43.960 CC module/bdev/compress/vbdev_compress.o 00:02:43.960 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:43.960 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:43.960 CC module/bdev/null/bdev_null.o 00:02:43.960 CC module/bdev/null/bdev_null_rpc.o 00:02:43.960 CC module/bdev/nvme/bdev_nvme.o 00:02:43.960 CC module/bdev/aio/bdev_aio.o 00:02:43.961 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:43.961 CC module/bdev/aio/bdev_aio_rpc.o 00:02:43.961 CC module/bdev/nvme/nvme_rpc.o 00:02:43.961 CC module/bdev/nvme/vbdev_opal.o 00:02:43.961 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:43.961 CC module/bdev/nvme/bdev_mdns_client.o 00:02:43.961 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:43.961 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:43.961 CC module/bdev/ftl/bdev_ftl.o 00:02:43.961 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:43.961 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:43.961 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:43.961 CC module/bdev/passthru/vbdev_passthru.o 00:02:43.961 CC module/bdev/raid/bdev_raid.o 00:02:43.961 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:43.961 CC module/bdev/raid/bdev_raid_sb.o 00:02:43.961 CC module/blobfs/bdev/blobfs_bdev.o 00:02:43.961 
CC module/bdev/raid/raid0.o 00:02:43.961 CC module/bdev/raid/bdev_raid_rpc.o 00:02:43.961 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:43.961 CC module/bdev/raid/raid1.o 00:02:44.218 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:44.218 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:44.218 CC module/bdev/raid/concat.o 00:02:44.218 CC module/bdev/iscsi/bdev_iscsi.o 00:02:44.218 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:44.218 CC module/bdev/crypto/vbdev_crypto.o 00:02:44.218 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:44.218 LIB libspdk_accel_dpdk_compressdev.a 00:02:44.218 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:44.218 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:44.218 LIB libspdk_bdev_null.a 00:02:44.476 LIB libspdk_blobfs_bdev.a 00:02:44.476 LIB libspdk_bdev_split.a 00:02:44.476 LIB libspdk_bdev_error.a 00:02:44.476 SO libspdk_bdev_null.so.6.0 00:02:44.476 LIB libspdk_bdev_passthru.a 00:02:44.476 SO libspdk_blobfs_bdev.so.6.0 00:02:44.476 SO libspdk_bdev_split.so.6.0 00:02:44.476 SO libspdk_bdev_error.so.6.0 00:02:44.476 LIB libspdk_bdev_gpt.a 00:02:44.476 SO libspdk_bdev_passthru.so.6.0 00:02:44.476 LIB libspdk_bdev_crypto.a 00:02:44.476 LIB libspdk_bdev_malloc.a 00:02:44.476 LIB libspdk_bdev_ftl.a 00:02:44.476 SYMLINK libspdk_bdev_null.so 00:02:44.476 SO libspdk_bdev_gpt.so.6.0 00:02:44.476 SO libspdk_bdev_crypto.so.6.0 00:02:44.476 SYMLINK libspdk_bdev_split.so 00:02:44.476 LIB libspdk_bdev_compress.a 00:02:44.476 SYMLINK libspdk_blobfs_bdev.so 00:02:44.476 SYMLINK libspdk_bdev_passthru.so 00:02:44.476 SO libspdk_bdev_malloc.so.6.0 00:02:44.476 SO libspdk_bdev_ftl.so.6.0 00:02:44.476 SYMLINK libspdk_bdev_error.so 00:02:44.476 LIB libspdk_bdev_zone_block.a 00:02:44.476 SO libspdk_bdev_compress.so.6.0 00:02:44.476 LIB libspdk_bdev_aio.a 00:02:44.476 LIB libspdk_bdev_delay.a 00:02:44.476 SYMLINK libspdk_bdev_crypto.so 00:02:44.476 SYMLINK libspdk_bdev_gpt.so 00:02:44.476 SO libspdk_bdev_zone_block.so.6.0 00:02:44.476 SYMLINK libspdk_bdev_malloc.so 00:02:44.476 SYMLINK libspdk_bdev_ftl.so 00:02:44.476 LIB libspdk_accel_dpdk_cryptodev.a 00:02:44.476 LIB libspdk_bdev_iscsi.a 00:02:44.476 LIB libspdk_bdev_virtio.a 00:02:44.476 SO libspdk_bdev_aio.so.6.0 00:02:44.476 SO libspdk_bdev_delay.so.6.0 00:02:44.476 SYMLINK libspdk_bdev_compress.so 00:02:44.476 SYMLINK libspdk_bdev_zone_block.so 00:02:44.476 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:44.735 SO libspdk_bdev_iscsi.so.6.0 00:02:44.735 SO libspdk_bdev_virtio.so.6.0 00:02:44.735 LIB libspdk_bdev_lvol.a 00:02:44.735 SYMLINK libspdk_bdev_aio.so 00:02:44.735 SYMLINK libspdk_bdev_delay.so 00:02:44.735 SYMLINK libspdk_bdev_iscsi.so 00:02:44.735 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:44.735 SYMLINK libspdk_bdev_virtio.so 00:02:44.735 SO libspdk_bdev_lvol.so.6.0 00:02:44.735 SYMLINK libspdk_bdev_lvol.so 00:02:44.993 LIB libspdk_bdev_raid.a 00:02:44.993 SO libspdk_bdev_raid.so.6.0 00:02:45.251 SYMLINK libspdk_bdev_raid.so 00:02:45.818 LIB libspdk_bdev_nvme.a 00:02:45.818 SO libspdk_bdev_nvme.so.7.0 00:02:45.818 SYMLINK libspdk_bdev_nvme.so 00:02:46.754 CC module/event/subsystems/scheduler/scheduler.o 00:02:46.754 CC module/event/subsystems/vmd/vmd.o 00:02:46.754 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:46.754 CC module/event/subsystems/keyring/keyring.o 00:02:46.754 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:46.754 CC module/event/subsystems/iobuf/iobuf.o 00:02:46.754 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:46.754 CC module/event/subsystems/sock/sock.o 00:02:46.754 LIB 
libspdk_event_vmd.a 00:02:46.754 LIB libspdk_event_scheduler.a 00:02:46.754 LIB libspdk_event_keyring.a 00:02:46.754 LIB libspdk_event_sock.a 00:02:46.754 LIB libspdk_event_vhost_blk.a 00:02:46.754 SO libspdk_event_scheduler.so.4.0 00:02:46.754 SO libspdk_event_vmd.so.6.0 00:02:46.754 SO libspdk_event_vhost_blk.so.3.0 00:02:46.754 LIB libspdk_event_iobuf.a 00:02:46.754 SO libspdk_event_sock.so.5.0 00:02:46.754 SO libspdk_event_keyring.so.1.0 00:02:46.754 SYMLINK libspdk_event_vhost_blk.so 00:02:46.754 SYMLINK libspdk_event_scheduler.so 00:02:46.754 SO libspdk_event_iobuf.so.3.0 00:02:46.754 SYMLINK libspdk_event_sock.so 00:02:46.754 SYMLINK libspdk_event_vmd.so 00:02:46.755 SYMLINK libspdk_event_keyring.so 00:02:47.014 SYMLINK libspdk_event_iobuf.so 00:02:47.274 CC module/event/subsystems/accel/accel.o 00:02:47.533 LIB libspdk_event_accel.a 00:02:47.533 SO libspdk_event_accel.so.6.0 00:02:47.533 SYMLINK libspdk_event_accel.so 00:02:47.793 CC module/event/subsystems/bdev/bdev.o 00:02:48.053 LIB libspdk_event_bdev.a 00:02:48.053 SO libspdk_event_bdev.so.6.0 00:02:48.053 SYMLINK libspdk_event_bdev.so 00:02:48.621 CC module/event/subsystems/scsi/scsi.o 00:02:48.621 CC module/event/subsystems/ublk/ublk.o 00:02:48.621 CC module/event/subsystems/nbd/nbd.o 00:02:48.622 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:48.622 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:48.622 LIB libspdk_event_nbd.a 00:02:48.622 LIB libspdk_event_ublk.a 00:02:48.622 LIB libspdk_event_scsi.a 00:02:48.622 SO libspdk_event_nbd.so.6.0 00:02:48.622 SO libspdk_event_ublk.so.3.0 00:02:48.622 SO libspdk_event_scsi.so.6.0 00:02:48.622 LIB libspdk_event_nvmf.a 00:02:48.881 SYMLINK libspdk_event_nbd.so 00:02:48.881 SYMLINK libspdk_event_ublk.so 00:02:48.881 SYMLINK libspdk_event_scsi.so 00:02:48.881 SO libspdk_event_nvmf.so.6.0 00:02:48.881 SYMLINK libspdk_event_nvmf.so 00:02:49.141 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:49.141 CC module/event/subsystems/iscsi/iscsi.o 00:02:49.400 LIB libspdk_event_vhost_scsi.a 00:02:49.400 LIB libspdk_event_iscsi.a 00:02:49.400 SO libspdk_event_vhost_scsi.so.3.0 00:02:49.400 SO libspdk_event_iscsi.so.6.0 00:02:49.400 SYMLINK libspdk_event_vhost_scsi.so 00:02:49.400 SYMLINK libspdk_event_iscsi.so 00:02:49.660 SO libspdk.so.6.0 00:02:49.660 SYMLINK libspdk.so 00:02:49.921 CXX app/trace/trace.o 00:02:49.921 CC app/spdk_nvme_discover/discovery_aer.o 00:02:49.921 CC app/spdk_nvme_identify/identify.o 00:02:49.921 CC app/trace_record/trace_record.o 00:02:49.921 CC app/spdk_top/spdk_top.o 00:02:49.921 CC app/spdk_lspci/spdk_lspci.o 00:02:49.921 CC app/spdk_nvme_perf/perf.o 00:02:49.921 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:49.921 CC test/rpc_client/rpc_client_test.o 00:02:49.921 CC app/nvmf_tgt/nvmf_main.o 00:02:49.921 TEST_HEADER include/spdk/assert.h 00:02:49.921 TEST_HEADER include/spdk/accel.h 00:02:49.921 TEST_HEADER include/spdk/base64.h 00:02:49.921 TEST_HEADER include/spdk/accel_module.h 00:02:49.921 TEST_HEADER include/spdk/bdev.h 00:02:49.921 TEST_HEADER include/spdk/bdev_module.h 00:02:49.921 TEST_HEADER include/spdk/barrier.h 00:02:49.921 TEST_HEADER include/spdk/bit_array.h 00:02:49.921 TEST_HEADER include/spdk/bdev_zone.h 00:02:49.921 TEST_HEADER include/spdk/blob_bdev.h 00:02:49.921 CC app/iscsi_tgt/iscsi_tgt.o 00:02:49.921 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:49.921 TEST_HEADER include/spdk/bit_pool.h 00:02:49.921 TEST_HEADER include/spdk/blobfs.h 00:02:49.921 TEST_HEADER include/spdk/conf.h 00:02:49.921 TEST_HEADER include/spdk/blob.h 
00:02:49.921 TEST_HEADER include/spdk/config.h 00:02:49.921 TEST_HEADER include/spdk/cpuset.h 00:02:49.921 TEST_HEADER include/spdk/crc16.h 00:02:49.921 TEST_HEADER include/spdk/crc32.h 00:02:49.921 TEST_HEADER include/spdk/dif.h 00:02:49.921 CC app/spdk_dd/spdk_dd.o 00:02:49.921 TEST_HEADER include/spdk/dma.h 00:02:49.921 TEST_HEADER include/spdk/crc64.h 00:02:49.921 TEST_HEADER include/spdk/endian.h 00:02:49.921 TEST_HEADER include/spdk/env_dpdk.h 00:02:49.921 TEST_HEADER include/spdk/env.h 00:02:49.921 TEST_HEADER include/spdk/event.h 00:02:49.921 TEST_HEADER include/spdk/fd.h 00:02:49.921 TEST_HEADER include/spdk/fd_group.h 00:02:49.921 TEST_HEADER include/spdk/ftl.h 00:02:49.921 TEST_HEADER include/spdk/file.h 00:02:49.921 TEST_HEADER include/spdk/gpt_spec.h 00:02:49.921 TEST_HEADER include/spdk/hexlify.h 00:02:49.921 TEST_HEADER include/spdk/histogram_data.h 00:02:49.921 CC app/spdk_tgt/spdk_tgt.o 00:02:49.921 TEST_HEADER include/spdk/idxd.h 00:02:49.921 TEST_HEADER include/spdk/idxd_spec.h 00:02:49.921 TEST_HEADER include/spdk/init.h 00:02:49.921 TEST_HEADER include/spdk/ioat.h 00:02:49.921 TEST_HEADER include/spdk/ioat_spec.h 00:02:49.921 TEST_HEADER include/spdk/json.h 00:02:49.921 TEST_HEADER include/spdk/iscsi_spec.h 00:02:49.921 TEST_HEADER include/spdk/jsonrpc.h 00:02:49.921 TEST_HEADER include/spdk/keyring.h 00:02:49.921 TEST_HEADER include/spdk/keyring_module.h 00:02:49.921 TEST_HEADER include/spdk/log.h 00:02:49.921 TEST_HEADER include/spdk/likely.h 00:02:49.921 TEST_HEADER include/spdk/lvol.h 00:02:49.921 TEST_HEADER include/spdk/memory.h 00:02:49.921 TEST_HEADER include/spdk/mmio.h 00:02:49.921 TEST_HEADER include/spdk/nbd.h 00:02:49.921 TEST_HEADER include/spdk/notify.h 00:02:49.921 TEST_HEADER include/spdk/nvme.h 00:02:49.921 TEST_HEADER include/spdk/nvme_intel.h 00:02:49.921 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:49.921 TEST_HEADER include/spdk/nvme_spec.h 00:02:49.921 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:49.921 TEST_HEADER include/spdk/nvme_zns.h 00:02:49.921 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:49.921 TEST_HEADER include/spdk/nvmf.h 00:02:49.921 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:49.921 TEST_HEADER include/spdk/nvmf_spec.h 00:02:49.921 TEST_HEADER include/spdk/nvmf_transport.h 00:02:49.921 TEST_HEADER include/spdk/opal.h 00:02:49.921 TEST_HEADER include/spdk/opal_spec.h 00:02:49.921 TEST_HEADER include/spdk/pci_ids.h 00:02:49.921 TEST_HEADER include/spdk/pipe.h 00:02:49.921 TEST_HEADER include/spdk/queue.h 00:02:49.921 TEST_HEADER include/spdk/reduce.h 00:02:49.921 TEST_HEADER include/spdk/scheduler.h 00:02:49.921 TEST_HEADER include/spdk/rpc.h 00:02:49.921 TEST_HEADER include/spdk/scsi.h 00:02:49.921 TEST_HEADER include/spdk/scsi_spec.h 00:02:49.921 TEST_HEADER include/spdk/sock.h 00:02:49.921 TEST_HEADER include/spdk/stdinc.h 00:02:49.921 TEST_HEADER include/spdk/string.h 00:02:49.921 TEST_HEADER include/spdk/thread.h 00:02:49.921 TEST_HEADER include/spdk/trace.h 00:02:49.921 TEST_HEADER include/spdk/trace_parser.h 00:02:49.921 TEST_HEADER include/spdk/tree.h 00:02:49.921 TEST_HEADER include/spdk/ublk.h 00:02:49.921 TEST_HEADER include/spdk/util.h 00:02:49.921 TEST_HEADER include/spdk/uuid.h 00:02:49.921 TEST_HEADER include/spdk/version.h 00:02:49.921 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:49.921 CC examples/util/zipf/zipf.o 00:02:49.921 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:49.921 TEST_HEADER include/spdk/vhost.h 00:02:49.921 TEST_HEADER include/spdk/vmd.h 00:02:49.921 TEST_HEADER include/spdk/zipf.h 00:02:49.921 
TEST_HEADER include/spdk/xor.h 00:02:49.921 CXX test/cpp_headers/accel.o 00:02:49.921 CXX test/cpp_headers/accel_module.o 00:02:49.921 CXX test/cpp_headers/assert.o 00:02:49.921 CXX test/cpp_headers/barrier.o 00:02:49.921 CXX test/cpp_headers/base64.o 00:02:49.921 CXX test/cpp_headers/bdev.o 00:02:49.921 CXX test/cpp_headers/bdev_module.o 00:02:49.921 CXX test/cpp_headers/bdev_zone.o 00:02:49.921 CXX test/cpp_headers/bit_array.o 00:02:49.921 CXX test/cpp_headers/bit_pool.o 00:02:49.921 CXX test/cpp_headers/blob_bdev.o 00:02:49.921 CXX test/cpp_headers/blobfs_bdev.o 00:02:49.921 CXX test/cpp_headers/blobfs.o 00:02:49.921 CXX test/cpp_headers/blob.o 00:02:49.921 CXX test/cpp_headers/conf.o 00:02:49.921 CXX test/cpp_headers/config.o 00:02:49.921 CXX test/cpp_headers/cpuset.o 00:02:49.921 CXX test/cpp_headers/crc16.o 00:02:49.921 CXX test/cpp_headers/crc32.o 00:02:49.921 CXX test/cpp_headers/dif.o 00:02:49.921 CXX test/cpp_headers/crc64.o 00:02:49.921 CC examples/ioat/verify/verify.o 00:02:49.921 CXX test/cpp_headers/endian.o 00:02:49.921 CXX test/cpp_headers/dma.o 00:02:49.921 CXX test/cpp_headers/env_dpdk.o 00:02:49.921 CXX test/cpp_headers/env.o 00:02:49.921 CXX test/cpp_headers/event.o 00:02:49.921 CXX test/cpp_headers/fd_group.o 00:02:49.921 CXX test/cpp_headers/fd.o 00:02:49.921 CXX test/cpp_headers/file.o 00:02:49.921 CXX test/cpp_headers/ftl.o 00:02:49.921 CXX test/cpp_headers/gpt_spec.o 00:02:49.921 CXX test/cpp_headers/histogram_data.o 00:02:49.921 CXX test/cpp_headers/hexlify.o 00:02:49.921 CXX test/cpp_headers/idxd.o 00:02:49.921 CXX test/cpp_headers/idxd_spec.o 00:02:49.921 CXX test/cpp_headers/ioat.o 00:02:49.921 CC examples/ioat/perf/perf.o 00:02:49.921 CXX test/cpp_headers/init.o 00:02:49.921 CXX test/cpp_headers/ioat_spec.o 00:02:49.921 CXX test/cpp_headers/iscsi_spec.o 00:02:49.921 CXX test/cpp_headers/json.o 00:02:49.921 CXX test/cpp_headers/jsonrpc.o 00:02:50.189 CXX test/cpp_headers/keyring.o 00:02:50.189 CC test/app/jsoncat/jsoncat.o 00:02:50.189 CC app/fio/nvme/fio_plugin.o 00:02:50.189 CC test/thread/poller_perf/poller_perf.o 00:02:50.189 CC test/app/stub/stub.o 00:02:50.189 CC test/env/vtophys/vtophys.o 00:02:50.189 CC test/env/memory/memory_ut.o 00:02:50.189 CC test/env/pci/pci_ut.o 00:02:50.189 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:50.189 LINK spdk_lspci 00:02:50.189 CC app/fio/bdev/fio_plugin.o 00:02:50.189 CC test/dma/test_dma/test_dma.o 00:02:50.189 CC test/app/histogram_perf/histogram_perf.o 00:02:50.189 CC test/app/bdev_svc/bdev_svc.o 00:02:50.189 LINK spdk_nvme_discover 00:02:50.189 LINK interrupt_tgt 00:02:50.189 LINK rpc_client_test 00:02:50.189 LINK nvmf_tgt 00:02:50.451 LINK spdk_trace_record 00:02:50.451 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:50.451 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:50.451 CC test/env/mem_callbacks/mem_callbacks.o 00:02:50.451 LINK iscsi_tgt 00:02:50.451 LINK zipf 00:02:50.451 LINK spdk_tgt 00:02:50.451 LINK vtophys 00:02:50.451 LINK jsoncat 00:02:50.451 CXX test/cpp_headers/keyring_module.o 00:02:50.451 LINK verify 00:02:50.451 CXX test/cpp_headers/likely.o 00:02:50.451 CXX test/cpp_headers/log.o 00:02:50.451 LINK stub 00:02:50.451 CXX test/cpp_headers/lvol.o 00:02:50.451 CXX test/cpp_headers/memory.o 00:02:50.451 CXX test/cpp_headers/mmio.o 00:02:50.451 CXX test/cpp_headers/nbd.o 00:02:50.451 CXX test/cpp_headers/notify.o 00:02:50.451 CXX test/cpp_headers/nvme.o 00:02:50.451 CXX test/cpp_headers/nvme_intel.o 00:02:50.451 LINK poller_perf 00:02:50.451 CXX test/cpp_headers/nvme_ocssd.o 00:02:50.451 
CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:50.451 CXX test/cpp_headers/nvme_spec.o 00:02:50.451 CXX test/cpp_headers/nvme_zns.o 00:02:50.451 CXX test/cpp_headers/nvmf_cmd.o 00:02:50.451 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:50.451 LINK ioat_perf 00:02:50.451 CXX test/cpp_headers/nvmf.o 00:02:50.451 CXX test/cpp_headers/nvmf_spec.o 00:02:50.451 CXX test/cpp_headers/nvmf_transport.o 00:02:50.451 LINK histogram_perf 00:02:50.451 CXX test/cpp_headers/opal.o 00:02:50.452 CXX test/cpp_headers/opal_spec.o 00:02:50.452 CXX test/cpp_headers/pci_ids.o 00:02:50.452 CXX test/cpp_headers/pipe.o 00:02:50.714 LINK env_dpdk_post_init 00:02:50.714 CXX test/cpp_headers/queue.o 00:02:50.714 CXX test/cpp_headers/reduce.o 00:02:50.714 CXX test/cpp_headers/rpc.o 00:02:50.714 CXX test/cpp_headers/scheduler.o 00:02:50.714 CXX test/cpp_headers/scsi_spec.o 00:02:50.714 CXX test/cpp_headers/scsi.o 00:02:50.714 CXX test/cpp_headers/sock.o 00:02:50.714 CXX test/cpp_headers/stdinc.o 00:02:50.714 CXX test/cpp_headers/thread.o 00:02:50.714 CXX test/cpp_headers/string.o 00:02:50.714 CXX test/cpp_headers/trace.o 00:02:50.714 CXX test/cpp_headers/trace_parser.o 00:02:50.714 CXX test/cpp_headers/tree.o 00:02:50.714 CXX test/cpp_headers/util.o 00:02:50.714 CXX test/cpp_headers/ublk.o 00:02:50.714 CXX test/cpp_headers/uuid.o 00:02:50.714 CXX test/cpp_headers/version.o 00:02:50.714 CXX test/cpp_headers/vfio_user_pci.o 00:02:50.714 CXX test/cpp_headers/vfio_user_spec.o 00:02:50.714 CXX test/cpp_headers/vhost.o 00:02:50.714 LINK bdev_svc 00:02:50.714 CXX test/cpp_headers/vmd.o 00:02:50.714 CXX test/cpp_headers/xor.o 00:02:50.714 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:50.714 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:50.714 CXX test/cpp_headers/zipf.o 00:02:50.714 LINK spdk_trace 00:02:50.714 LINK spdk_dd 00:02:50.973 LINK pci_ut 00:02:50.973 LINK test_dma 00:02:50.973 LINK nvme_fuzz 00:02:50.973 LINK spdk_nvme 00:02:50.973 LINK spdk_bdev 00:02:50.973 CC examples/vmd/led/led.o 00:02:50.973 CC examples/vmd/lsvmd/lsvmd.o 00:02:50.973 CC examples/sock/hello_world/hello_sock.o 00:02:50.973 CC examples/idxd/perf/perf.o 00:02:50.973 CC test/event/reactor/reactor.o 00:02:50.973 CC test/event/event_perf/event_perf.o 00:02:50.973 CC examples/thread/thread/thread_ex.o 00:02:50.973 CC test/event/reactor_perf/reactor_perf.o 00:02:51.231 CC test/event/app_repeat/app_repeat.o 00:02:51.231 CC test/event/scheduler/scheduler.o 00:02:51.231 LINK spdk_nvme_identify 00:02:51.231 LINK spdk_nvme_perf 00:02:51.231 LINK spdk_top 00:02:51.231 LINK lsvmd 00:02:51.231 LINK reactor 00:02:51.231 LINK led 00:02:51.231 LINK reactor_perf 00:02:51.231 LINK event_perf 00:02:51.231 CC app/vhost/vhost.o 00:02:51.231 LINK mem_callbacks 00:02:51.231 LINK vhost_fuzz 00:02:51.231 LINK hello_sock 00:02:51.231 LINK app_repeat 00:02:51.231 LINK scheduler 00:02:51.231 LINK thread 00:02:51.490 LINK idxd_perf 00:02:51.490 LINK vhost 00:02:51.490 CC test/nvme/aer/aer.o 00:02:51.490 CC test/nvme/reserve/reserve.o 00:02:51.490 CC test/nvme/simple_copy/simple_copy.o 00:02:51.490 CC test/nvme/startup/startup.o 00:02:51.490 CC test/nvme/err_injection/err_injection.o 00:02:51.490 CC test/nvme/e2edp/nvme_dp.o 00:02:51.490 CC test/nvme/overhead/overhead.o 00:02:51.490 CC test/nvme/boot_partition/boot_partition.o 00:02:51.490 CC test/nvme/fdp/fdp.o 00:02:51.490 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:51.490 LINK memory_ut 00:02:51.490 CC test/nvme/connect_stress/connect_stress.o 00:02:51.490 CC test/nvme/fused_ordering/fused_ordering.o 00:02:51.490 CC 
test/nvme/cuse/cuse.o 00:02:51.490 CC test/nvme/sgl/sgl.o 00:02:51.490 CC test/nvme/reset/reset.o 00:02:51.490 CC test/nvme/compliance/nvme_compliance.o 00:02:51.490 CC test/accel/dif/dif.o 00:02:51.490 CC test/blobfs/mkfs/mkfs.o 00:02:51.490 CC test/lvol/esnap/esnap.o 00:02:51.749 LINK startup 00:02:51.749 LINK boot_partition 00:02:51.749 LINK connect_stress 00:02:51.749 LINK doorbell_aers 00:02:51.749 LINK fused_ordering 00:02:51.749 LINK reserve 00:02:51.749 LINK err_injection 00:02:51.749 LINK simple_copy 00:02:51.749 LINK mkfs 00:02:51.749 LINK reset 00:02:51.749 LINK nvme_dp 00:02:51.749 CC examples/nvme/abort/abort.o 00:02:51.749 LINK aer 00:02:51.749 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:51.749 LINK sgl 00:02:51.749 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:51.749 CC examples/nvme/arbitration/arbitration.o 00:02:51.749 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:51.749 CC examples/nvme/hotplug/hotplug.o 00:02:51.749 CC examples/nvme/reconnect/reconnect.o 00:02:51.749 CC examples/nvme/hello_world/hello_world.o 00:02:51.749 LINK overhead 00:02:51.749 LINK fdp 00:02:51.749 LINK nvme_compliance 00:02:51.749 CC examples/accel/perf/accel_perf.o 00:02:51.749 CC examples/blob/cli/blobcli.o 00:02:52.009 CC examples/blob/hello_world/hello_blob.o 00:02:52.009 LINK dif 00:02:52.009 LINK cmb_copy 00:02:52.009 LINK pmr_persistence 00:02:52.009 LINK hello_world 00:02:52.009 LINK hotplug 00:02:52.009 LINK abort 00:02:52.009 LINK arbitration 00:02:52.009 LINK reconnect 00:02:52.009 LINK iscsi_fuzz 00:02:52.268 LINK hello_blob 00:02:52.268 LINK nvme_manage 00:02:52.268 LINK accel_perf 00:02:52.268 LINK blobcli 00:02:52.527 LINK cuse 00:02:52.527 CC test/bdev/bdevio/bdevio.o 00:02:52.786 CC examples/bdev/bdevperf/bdevperf.o 00:02:52.786 CC examples/bdev/hello_world/hello_bdev.o 00:02:52.786 LINK bdevio 00:02:53.045 LINK hello_bdev 00:02:53.304 LINK bdevperf 00:02:53.871 CC examples/nvmf/nvmf/nvmf.o 00:02:54.129 LINK nvmf 00:02:55.066 LINK esnap 00:02:55.640 00:02:55.640 real 1m14.363s 00:02:55.640 user 14m22.357s 00:02:55.640 sys 3m49.344s 00:02:55.640 13:25:42 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:02:55.640 13:25:42 make -- common/autotest_common.sh@10 -- $ set +x 00:02:55.640 ************************************ 00:02:55.640 END TEST make 00:02:55.640 ************************************ 00:02:55.640 13:25:43 -- common/autotest_common.sh@1142 -- $ return 0 00:02:55.640 13:25:43 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:55.640 13:25:43 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:55.640 13:25:43 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:55.640 13:25:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.640 13:25:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:55.640 13:25:43 -- pm/common@44 -- $ pid=4018175 00:02:55.640 13:25:43 -- pm/common@50 -- $ kill -TERM 4018175 00:02:55.640 13:25:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.640 13:25:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:55.640 13:25:43 -- pm/common@44 -- $ pid=4018177 00:02:55.640 13:25:43 -- pm/common@50 -- $ kill -TERM 4018177 00:02:55.640 13:25:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.640 13:25:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid 
]] 00:02:55.640 13:25:43 -- pm/common@44 -- $ pid=4018179 00:02:55.640 13:25:43 -- pm/common@50 -- $ kill -TERM 4018179 00:02:55.640 13:25:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.640 13:25:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:55.640 13:25:43 -- pm/common@44 -- $ pid=4018202 00:02:55.640 13:25:43 -- pm/common@50 -- $ sudo -E kill -TERM 4018202 00:02:55.640 13:25:43 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:55.640 13:25:43 -- nvmf/common.sh@7 -- # uname -s 00:02:55.640 13:25:43 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:55.640 13:25:43 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:55.640 13:25:43 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:55.640 13:25:43 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:55.640 13:25:43 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:55.640 13:25:43 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:55.640 13:25:43 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:55.640 13:25:43 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:55.640 13:25:43 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:55.640 13:25:43 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:55.640 13:25:43 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00d40ca9-2a78-e711-906e-0017a4403562 00:02:55.640 13:25:43 -- nvmf/common.sh@18 -- # NVME_HOSTID=00d40ca9-2a78-e711-906e-0017a4403562 00:02:55.640 13:25:43 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:55.640 13:25:43 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:55.640 13:25:43 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:55.640 13:25:43 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:55.640 13:25:43 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:55.640 13:25:43 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:55.640 13:25:43 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:55.640 13:25:43 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:55.640 13:25:43 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.640 13:25:43 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.640 13:25:43 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.640 13:25:43 -- paths/export.sh@5 -- # export PATH 00:02:55.640 13:25:43 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.640 13:25:43 -- nvmf/common.sh@47 -- # : 0 00:02:55.640 13:25:43 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:55.640 13:25:43 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:55.641 13:25:43 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:55.641 13:25:43 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:55.641 13:25:43 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:55.641 13:25:43 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:55.641 13:25:43 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:55.641 13:25:43 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:55.641 13:25:43 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:55.641 13:25:43 -- spdk/autotest.sh@32 -- # uname -s 00:02:55.641 13:25:43 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:55.641 13:25:43 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:55.641 13:25:43 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:55.641 13:25:43 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:55.641 13:25:43 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:55.641 13:25:43 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:55.641 13:25:43 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:55.641 13:25:43 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:55.641 13:25:43 -- spdk/autotest.sh@48 -- # udevadm_pid=4082679 00:02:55.641 13:25:43 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:55.641 13:25:43 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:55.641 13:25:43 -- pm/common@17 -- # local monitor 00:02:55.641 13:25:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.641 13:25:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.641 13:25:43 -- pm/common@21 -- # date +%s 00:02:55.641 13:25:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.641 13:25:43 -- pm/common@21 -- # date +%s 00:02:55.641 13:25:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:55.641 13:25:43 -- pm/common@21 -- # date +%s 00:02:55.641 13:25:43 -- pm/common@25 -- # sleep 1 00:02:55.641 13:25:43 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042743 00:02:55.641 13:25:43 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042743 00:02:55.641 13:25:43 -- pm/common@21 -- # date +%s 00:02:55.641 13:25:43 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042743 00:02:55.641 13:25:43 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042743 00:02:55.641 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042743_collect-vmstat.pm.log 00:02:55.641 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042743_collect-cpu-temp.pm.log 00:02:55.641 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042743_collect-cpu-load.pm.log 00:02:55.898 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042743_collect-bmc-pm.bmc.pm.log 00:02:56.832 13:25:44 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:56.832 13:25:44 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:56.832 13:25:44 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:56.832 13:25:44 -- common/autotest_common.sh@10 -- # set +x 00:02:56.832 13:25:44 -- spdk/autotest.sh@59 -- # create_test_list 00:02:56.832 13:25:44 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:56.832 13:25:44 -- common/autotest_common.sh@10 -- # set +x 00:02:56.832 13:25:44 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:56.832 13:25:44 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:56.832 13:25:44 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:56.832 13:25:44 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:56.832 13:25:44 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:56.832 13:25:44 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:56.832 13:25:44 -- common/autotest_common.sh@1455 -- # uname 00:02:56.832 13:25:44 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:56.832 13:25:44 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:56.832 13:25:44 -- common/autotest_common.sh@1475 -- # uname 00:02:56.832 13:25:44 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:56.832 13:25:44 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:56.832 13:25:44 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:56.832 13:25:44 -- spdk/autotest.sh@72 -- # hash lcov 00:02:56.832 13:25:44 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:56.832 13:25:44 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:56.832 --rc lcov_branch_coverage=1 00:02:56.832 --rc lcov_function_coverage=1 00:02:56.832 --rc genhtml_branch_coverage=1 00:02:56.832 --rc genhtml_function_coverage=1 00:02:56.832 --rc genhtml_legend=1 00:02:56.832 --rc geninfo_all_blocks=1 00:02:56.832 ' 00:02:56.832 13:25:44 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:56.832 --rc lcov_branch_coverage=1 00:02:56.832 --rc lcov_function_coverage=1 00:02:56.832 --rc genhtml_branch_coverage=1 00:02:56.832 --rc genhtml_function_coverage=1 00:02:56.832 --rc genhtml_legend=1 00:02:56.832 --rc geninfo_all_blocks=1 00:02:56.832 ' 00:02:56.832 13:25:44 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:56.832 --rc lcov_branch_coverage=1 00:02:56.832 --rc lcov_function_coverage=1 00:02:56.832 --rc genhtml_branch_coverage=1 00:02:56.832 --rc genhtml_function_coverage=1 00:02:56.832 --rc genhtml_legend=1 00:02:56.832 --rc geninfo_all_blocks=1 00:02:56.832 --no-external' 00:02:56.832 13:25:44 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:02:56.832 --rc lcov_branch_coverage=1 00:02:56.832 --rc lcov_function_coverage=1 00:02:56.832 --rc genhtml_branch_coverage=1 00:02:56.832 --rc genhtml_function_coverage=1 00:02:56.832 --rc genhtml_legend=1 00:02:56.832 --rc geninfo_all_blocks=1 00:02:56.832 --no-external' 00:02:56.832 13:25:44 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:56.832 lcov: LCOV version 1.14 00:02:56.832 13:25:44 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:09.040 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:09.040 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:19.022 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:19.022 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:19.022 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:19.022 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:19.023 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:19.023 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:19.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:19.023 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:19.024 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:19.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:22.307 13:26:09 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:22.307 13:26:09 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:22.307 13:26:09 -- common/autotest_common.sh@10 -- # set +x 00:03:22.307 13:26:09 -- spdk/autotest.sh@91 -- # rm -f 00:03:22.307 13:26:09 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.719 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:25.719 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:03:25.719 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:25.719 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:00:04.0 (8086 2021): Already 
using the ioatdma driver 00:03:25.719 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:25.719 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:25.720 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:25.720 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:25.720 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:25.720 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:25.720 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:25.720 13:26:13 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:25.720 13:26:13 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:25.720 13:26:13 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:25.720 13:26:13 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:25.720 13:26:13 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:25.720 13:26:13 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:25.720 13:26:13 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:25.720 13:26:13 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:25.720 13:26:13 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:25.720 13:26:13 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:25.720 13:26:13 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:25.720 13:26:13 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:25.720 13:26:13 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:25.720 13:26:13 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:25.720 13:26:13 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:25.720 No valid GPT data, bailing 00:03:25.720 13:26:13 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:25.979 13:26:13 -- scripts/common.sh@391 -- # pt= 00:03:25.979 13:26:13 -- scripts/common.sh@392 -- # return 1 00:03:25.979 13:26:13 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:25.979 1+0 records in 00:03:25.979 1+0 records out 00:03:25.979 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00604433 s, 173 MB/s 00:03:25.979 13:26:13 -- spdk/autotest.sh@118 -- # sync 00:03:25.979 13:26:13 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:25.979 13:26:13 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:25.979 13:26:13 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:30.174 13:26:17 -- spdk/autotest.sh@124 -- # uname -s 00:03:30.174 13:26:17 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:30.174 13:26:17 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:30.174 13:26:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:30.174 13:26:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:30.174 13:26:17 -- common/autotest_common.sh@10 -- # set +x 00:03:30.174 ************************************ 00:03:30.174 START TEST setup.sh 00:03:30.174 ************************************ 00:03:30.174 13:26:17 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:30.174 * Looking for test storage... 
00:03:30.174 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:30.174 13:26:17 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:30.174 13:26:17 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:30.174 13:26:17 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:30.174 13:26:17 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:30.174 13:26:17 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:30.174 13:26:17 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:30.174 ************************************ 00:03:30.174 START TEST acl 00:03:30.174 ************************************ 00:03:30.174 13:26:17 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:30.433 * Looking for test storage... 00:03:30.433 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:30.433 13:26:17 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:30.433 13:26:17 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:30.433 13:26:17 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:30.433 13:26:17 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:30.433 13:26:17 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:30.433 13:26:17 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:30.433 13:26:17 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:30.433 13:26:17 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:30.433 13:26:17 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:34.627 13:26:21 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:34.627 13:26:21 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:34.627 13:26:21 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.627 13:26:21 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:34.627 13:26:21 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.627 13:26:21 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 Hugepages 00:03:37.916 node hugesize free / total 
00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 00:03:37.916 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@20 -- # 
continue 00:03:37.916 13:26:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ - == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:ae:05.5 == *:*:*.* ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ - == nvme ]] 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.916 13:26:25 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:37.917 13:26:25 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:37.917 13:26:25 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:37.917 13:26:25 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:37.917 13:26:25 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:37.917 ************************************ 00:03:37.917 START TEST denied 00:03:37.917 ************************************ 00:03:37.917 13:26:25 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:37.917 13:26:25 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:03:37.917 13:26:25 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:37.917 13:26:25 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:37.917 13:26:25 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:37.917 13:26:25 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:41.208 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:41.208 13:26:28 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:46.482 00:03:46.482 real 0m7.995s 00:03:46.482 user 0m2.407s 00:03:46.482 sys 0m4.823s 00:03:46.482 13:26:33 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:46.482 13:26:33 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:46.482 ************************************ 00:03:46.482 END TEST denied 00:03:46.482 ************************************ 00:03:46.482 13:26:33 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:46.482 13:26:33 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:46.482 13:26:33 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:46.482 13:26:33 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.482 13:26:33 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:46.482 ************************************ 00:03:46.482 START TEST allowed 00:03:46.482 ************************************ 00:03:46.482 13:26:33 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- 
# allowed 00:03:46.482 13:26:33 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:46.482 13:26:33 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:46.482 13:26:33 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:46.482 13:26:33 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.482 13:26:33 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:50.676 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:03:50.676 13:26:37 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:50.676 13:26:37 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:50.676 13:26:37 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:50.676 13:26:37 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:50.676 13:26:37 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:53.969 00:03:53.969 real 0m8.385s 00:03:53.969 user 0m2.567s 00:03:53.969 sys 0m4.675s 00:03:53.969 13:26:41 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:53.969 13:26:41 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:53.969 ************************************ 00:03:53.969 END TEST allowed 00:03:53.969 ************************************ 00:03:54.229 13:26:41 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:54.230 00:03:54.230 real 0m23.848s 00:03:54.230 user 0m7.558s 00:03:54.230 sys 0m14.638s 00:03:54.230 13:26:41 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:54.230 13:26:41 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:54.230 ************************************ 00:03:54.230 END TEST acl 00:03:54.230 ************************************ 00:03:54.230 13:26:41 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:03:54.230 13:26:41 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.230 13:26:41 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:54.230 13:26:41 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.230 13:26:41 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:54.230 ************************************ 00:03:54.230 START TEST hugepages 00:03:54.230 ************************************ 00:03:54.230 13:26:41 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.230 * Looking for test storage... 
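The denied and allowed tests above follow the same pattern: setup/acl.sh exports a PCI filter variable (PCI_BLOCKED for the denied case, PCI_ALLOWED for the allowed case), re-runs scripts/setup.sh config, greps its output for the expected per-controller message, and then checks the driver symlink under /sys/bus/pci/devices. A minimal sketch of that flow, reduced from the trace above (paths are shortened to the SPDK tree and the run_test/verify scaffolding is omitted; the variable values and grep patterns are the ones visible in the log):

  # denied: 0000:5e:00.0 must be reported as skipped and stay bound to nvme
  PCI_BLOCKED=' 0000:5e:00.0' scripts/setup.sh config \
      | grep 'Skipping denied controller at 0000:5e:00.0'
  [[ $(readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver) == */drivers/nvme ]]

  # allowed: only 0000:5e:00.0 is eligible, so config rebinds it nvme -> vfio-pci
  PCI_ALLOWED=0000:5e:00.0 scripts/setup.sh config \
      | grep -E '0000:5e:00.0 .*: nvme -> .*'

After each case the test runs scripts/setup.sh reset (the reset calls visible above) to put the controller back on a kernel driver before the next case.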
00:03:54.230 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 69984196 kB' 'MemAvailable: 74690560 kB' 'Buffers: 16532 kB' 'Cached: 15712304 kB' 'SwapCached: 0 kB' 'Active: 11667200 kB' 'Inactive: 4621360 kB' 'Active(anon): 11261756 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563620 kB' 'Mapped: 225840 kB' 'Shmem: 10702032 kB' 'KReclaimable: 515544 kB' 'Slab: 895424 kB' 'SReclaimable: 515544 kB' 'SUnreclaim: 379880 kB' 'KernelStack: 15696 kB' 'PageTables: 8912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438220 kB' 'Committed_AS: 12660092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201488 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 
setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- 
setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 
13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.230 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:54.231 13:26:41 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.231 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:54.490 13:26:41 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:54.490 13:26:41 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:54.490 13:26:41 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.490 13:26:41 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:54.490 ************************************ 00:03:54.490 START TEST default_setup 00:03:54.490 ************************************ 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.490 13:26:41 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:57.779 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:57.779 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:03:57.779 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:57.779 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:57.779 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:57.779 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 
0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:58.040 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:59.426 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72134940 kB' 'MemAvailable: 76840920 kB' 'Buffers: 16532 kB' 'Cached: 15712408 kB' 'SwapCached: 0 kB' 'Active: 11685124 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279680 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580880 kB' 'Mapped: 226460 kB' 'Shmem: 10702136 kB' 'KReclaimable: 515160 kB' 'Slab: 894572 kB' 'SReclaimable: 515160 kB' 'SUnreclaim: 379412 kB' 'KernelStack: 15856 kB' 'PageTables: 8836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12688596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201792 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.426 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.427 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72137704 kB' 'MemAvailable: 76843684 kB' 'Buffers: 16532 kB' 'Cached: 15712412 kB' 'SwapCached: 0 kB' 'Active: 11690748 kB' 'Inactive: 4621360 kB' 'Active(anon): 11285304 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 586504 kB' 'Mapped: 226460 kB' 'Shmem: 10702140 kB' 'KReclaimable: 515160 kB' 'Slab: 894548 kB' 'SReclaimable: 515160 kB' 'SUnreclaim: 379388 kB' 'KernelStack: 16160 kB' 'PageTables: 10008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12694068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201812 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.427 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 
13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.428 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72137296 kB' 'MemAvailable: 76843276 kB' 'Buffers: 16532 kB' 'Cached: 15712412 kB' 'SwapCached: 0 kB' 'Active: 11685668 kB' 'Inactive: 
4621360 kB' 'Active(anon): 11280224 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 581408 kB' 'Mapped: 226336 kB' 'Shmem: 10702140 kB' 'KReclaimable: 515160 kB' 'Slab: 894584 kB' 'SReclaimable: 515160 kB' 'SUnreclaim: 379424 kB' 'KernelStack: 16208 kB' 'PageTables: 10484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12687968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201760 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.429 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:59.430 nr_hugepages=1024 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:59.430 resv_hugepages=0 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:59.430 surplus_hugepages=0 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:59.430 anon_hugepages=0 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72136916 kB' 'MemAvailable: 76842896 kB' 'Buffers: 16532 kB' 'Cached: 15712452 kB' 'SwapCached: 0 kB' 'Active: 11685032 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279588 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580724 kB' 'Mapped: 225940 kB' 'Shmem: 10702180 kB' 'KReclaimable: 515160 kB' 'Slab: 894552 kB' 'SReclaimable: 515160 kB' 
'SUnreclaim: 379392 kB' 'KernelStack: 16144 kB' 'PageTables: 10228 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12687992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201792 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.430 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:59.431 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:59.432 13:26:46 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 33490464 kB' 'MemUsed: 14579468 kB' 'SwapCached: 0 kB' 'Active: 8097644 kB' 'Inactive: 3476228 kB' 'Active(anon): 7913624 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189264 kB' 'Mapped: 141136 kB' 'AnonPages: 387816 kB' 'Shmem: 7529016 kB' 'KernelStack: 9640 kB' 'PageTables: 7180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186324 kB' 'Slab: 416288 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 229964 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.432 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.433 
13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:59.433 node0=1024 expecting 1024 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:59.433 00:03:59.433 real 0m5.099s 00:03:59.433 user 0m1.446s 00:03:59.433 sys 0m2.457s 00:03:59.433 13:26:46 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:59.433 13:26:47 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:59.433 ************************************ 00:03:59.433 END TEST default_setup 00:03:59.433 ************************************ 00:03:59.692 13:26:47 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:59.692 13:26:47 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:59.693 13:26:47 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:59.693 13:26:47 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:59.693 13:26:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:59.693 ************************************ 00:03:59.693 START TEST per_node_1G_alloc 00:03:59.693 ************************************ 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 0 1 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:59.693 13:26:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:03.054 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:03.054 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:04:03.054 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:03.054 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:03.054 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # 
verify_nr_hugepages 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.054 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72143896 kB' 'MemAvailable: 76849876 kB' 'Buffers: 16532 kB' 'Cached: 15712548 kB' 'SwapCached: 0 kB' 'Active: 11680184 kB' 'Inactive: 4621360 kB' 'Active(anon): 11274740 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575752 kB' 'Mapped: 225000 kB' 'Shmem: 10702276 kB' 'KReclaimable: 515160 kB' 'Slab: 894632 kB' 'SReclaimable: 515160 kB' 'SUnreclaim: 379472 kB' 'KernelStack: 15664 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12666204 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201632 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.055 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.319 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:03.320 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.320 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72145112 kB' 'MemAvailable: 76851092 kB' 'Buffers: 16532 kB' 'Cached: 15712552 kB' 'SwapCached: 0 kB' 'Active: 11681096 kB' 'Inactive: 4621360 kB' 'Active(anon): 11275652 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576796 kB' 'Mapped: 224968 kB' 'Shmem: 10702280 kB' 'KReclaimable: 515160 kB' 'Slab: 894628 kB' 'SReclaimable: 515160 kB' 'SUnreclaim: 379468 kB' 'KernelStack: 15664 kB' 'PageTables: 8504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12667712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201584 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.321 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72145584 kB' 'MemAvailable: 76851532 kB' 'Buffers: 16532 kB' 'Cached: 15712568 kB' 'SwapCached: 0 kB' 'Active: 11680636 kB' 'Inactive: 4621360 kB' 'Active(anon): 11275192 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576272 kB' 'Mapped: 224892 kB' 'Shmem: 10702296 kB' 'KReclaimable: 515128 kB' 'Slab: 894568 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379440 kB' 'KernelStack: 15744 kB' 'PageTables: 8872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12668848 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201712 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.322 
13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.322 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.323 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:03.324 nr_hugepages=1024 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:03.324 resv_hugepages=0 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:03.324 surplus_hugepages=0 00:04:03.324 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:03.324 anon_hugepages=0 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72148792 kB' 'MemAvailable: 76854740 kB' 'Buffers: 16532 kB' 'Cached: 15712592 kB' 'SwapCached: 0 kB' 'Active: 11680804 kB' 'Inactive: 4621360 kB' 'Active(anon): 11275360 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576156 kB' 'Mapped: 224892 kB' 'Shmem: 10702320 kB' 'KReclaimable: 515128 kB' 'Slab: 894568 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379440 kB' 'KernelStack: 15744 kB' 'PageTables: 8280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12668868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201680 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.324 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.325 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 34552324 kB' 'MemUsed: 13517608 kB' 'SwapCached: 0 kB' 'Active: 8094660 kB' 'Inactive: 3476228 kB' 'Active(anon): 7910640 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189276 kB' 'Mapped: 140356 kB' 'AnonPages: 384564 kB' 'Shmem: 7529028 kB' 'KernelStack: 9272 kB' 'PageTables: 5720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186324 kB' 'Slab: 416092 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 229768 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.326 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:03.327 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223604 kB' 'MemFree: 37599076 kB' 'MemUsed: 6624528 kB' 'SwapCached: 0 kB' 'Active: 3586044 kB' 'Inactive: 1145132 kB' 'Active(anon): 3364620 kB' 'Inactive(anon): 0 kB' 'Active(file): 221424 kB' 'Inactive(file): 1145132 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4539888 kB' 'Mapped: 84536 kB' 'AnonPages: 191364 kB' 'Shmem: 3173332 kB' 'KernelStack: 6488 kB' 'PageTables: 2884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328804 kB' 'Slab: 478476 kB' 'SReclaimable: 328804 kB' 'SUnreclaim: 149672 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.328 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 
13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:03.329 node0=512 expecting 512 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:03.329 node1=512 expecting 512 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:03.329 00:04:03.329 real 0m3.790s 00:04:03.329 user 0m1.450s 00:04:03.329 sys 0m2.416s 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:03.329 13:26:50 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:03.329 ************************************ 00:04:03.329 END TEST per_node_1G_alloc 00:04:03.329 ************************************ 00:04:03.329 13:26:50 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:03.329 13:26:50 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:03.329 13:26:50 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:03.329 13:26:50 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:03.329 13:26:50 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set 
+x 00:04:03.588 ************************************ 00:04:03.588 START TEST even_2G_alloc 00:04:03.588 ************************************ 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:03.588 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:03.589 13:26:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:06.876 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:06.876 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:04:07.136 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:07.136 0000:00:04.6 (8086 2021): Already using 
the vfio-pci driver 00:04:07.136 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:07.136 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.136 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72156272 kB' 'MemAvailable: 76862220 kB' 'Buffers: 16532 kB' 'Cached: 15712700 kB' 'SwapCached: 0 kB' 'Active: 11680788 kB' 'Inactive: 4621360 kB' 'Active(anon): 11275344 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 
'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576192 kB' 'Mapped: 224908 kB' 'Shmem: 10702428 kB' 'KReclaimable: 515128 kB' 'Slab: 894464 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379336 kB' 'KernelStack: 15680 kB' 'PageTables: 8616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12668240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201552 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
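The xtrace above and below is setup/common.sh's get_meminfo helper scanning a meminfo file key by key: it picks /proc/meminfo or a per-node /sys/devices/system/node/nodeN/meminfo, splits each line on IFS=': ' into a key and a value, skips non-matching keys with continue, and echoes the value once the requested key (here AnonHugePages, earlier HugePages_Total and HugePages_Surp) is found. A condensed standalone sketch of that lookup pattern follows; the function name get_meminfo_sketch and the simplified loop are illustrative rather than the SPDK source verbatim, and it assumes a Linux host where those meminfo files exist.

#!/usr/bin/env bash
# Condensed sketch of the get_meminfo lookup pattern traced above (illustrative,
# not setup/common.sh verbatim). Given a key and an optional node index, it
# reads the matching meminfo file and prints the value for that key.
shopt -s extglob   # needed for the "Node N " prefix strip below

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # With a node argument, read that node's file; its lines carry a "Node N " prefix.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix so keys line up
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # same skip-until-match loop as the trace
        echo "$val"
        return 0
    done
    return 1
}

# Example calls matching values echoed in the trace on this host:
#   get_meminfo_sketch HugePages_Total     -> 1024
#   get_meminfo_sketch HugePages_Surp 0    -> 0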
00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.137 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@97 -- # anon=0 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72158548 kB' 'MemAvailable: 76864496 kB' 'Buffers: 16532 kB' 'Cached: 15712704 kB' 'SwapCached: 0 kB' 'Active: 11681588 kB' 'Inactive: 4621360 kB' 'Active(anon): 11276144 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577100 kB' 'Mapped: 224940 kB' 'Shmem: 10702432 kB' 'KReclaimable: 515128 kB' 'Slab: 894448 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379320 kB' 'KernelStack: 15968 kB' 'PageTables: 9220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201664 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.138 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72157696 kB' 'MemAvailable: 76863644 kB' 'Buffers: 16532 kB' 'Cached: 15712720 kB' 'SwapCached: 0 kB' 'Active: 11682128 kB' 'Inactive: 4621360 kB' 'Active(anon): 11276684 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577576 kB' 'Mapped: 224904 kB' 'Shmem: 10702448 kB' 'KReclaimable: 515128 kB' 'Slab: 894448 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379320 kB' 'KernelStack: 16080 kB' 'PageTables: 9468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669392 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201648 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
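The preamble of each lookup (local get/node, mem_f=/proc/meminfo, the -e test on the sysfs path, mapfile, and the extglob strip of the "Node N " prefix) shows how get_meminfo selects its source. Because node is empty in these calls, the tested path collapses to /sys/devices/system/node/node/meminfo, which does not exist, so the global /proc/meminfo is used. A reconstruction of that flow, assuming setup/common.sh behaves as the xtrace suggests; the names mirror the trace, the body is an illustration rather than the upstream source:

    # Sketch of the flow implied by the xtrace; not the upstream setup/common.sh.
    get_meminfo_sketch() {
        local get=$1 node=${2:-}                 # key to look up, optional NUMA node
        local mem_f=/proc/meminfo
        # With an empty $node this test sees /sys/devices/system/node/node/meminfo,
        # which never exists, so the global file is kept, exactly as the trace shows.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        shopt -s extglob
        mem=("${mem[@]#Node +([0-9]) }")         # per-node sysfs lines carry a "Node N " prefix
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1                                 # key not present
    }

Called as, for example, get_meminfo_sketch HugePages_Rsvd, it would print 0 on the machine above, matching the resv=0 the script records next.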
00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.139 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.400 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:07.401 nr_hugepages=1024 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:07.401 resv_hugepages=0 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:07.401 surplus_hugepages=0 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:07.401 anon_hugepages=0 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 
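With anon=0, surp=0 and resv=0 collected, hugepages.sh echoes its summary (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0), and the arithmetic at @107/@109 appears to confirm that the pool the kernel reports is fully accounted for by the requested pages before HugePages_Total is fetched once more. The numbers in the snapshots are internally consistent: 1024 pages x 2048 kB Hugepagesize = 2097152 kB, matching the Hugetlb line, i.e. the 2 GiB that the even_2G_alloc test is expected to spread across NUMA nodes. A standalone restatement of that consistency check (a sketch; variable names follow the trace, the exact upstream expressions may differ):

    # Illustrative only: expected values taken from the meminfo snapshots printed above.
    nr_hugepages=1024 surp=0 resv=0 anon=0
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    hp_kb=$(awk '/^Hugepagesize:/   {print $2}' /proc/meminfo)
    (( total == nr_hugepages + surp + resv )) || echo "unexpected surplus/reserved pages" >&2
    (( total == nr_hugepages ))               || echo "pool differs from the requested 1024" >&2
    (( total * hp_kb == 2 * 1024 * 1024 ))    || echo "pool is not the expected 2 GiB" >&2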
00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72159020 kB' 'MemAvailable: 76864968 kB' 'Buffers: 16532 kB' 'Cached: 15712744 kB' 'SwapCached: 0 kB' 'Active: 11682240 kB' 'Inactive: 4621360 kB' 'Active(anon): 11276796 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577652 kB' 'Mapped: 224904 kB' 'Shmem: 10702472 kB' 'KReclaimable: 515128 kB' 'Slab: 894352 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379224 kB' 'KernelStack: 16144 kB' 'PageTables: 9600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201744 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 
13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.401 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 
13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 34555024 kB' 'MemUsed: 13514908 kB' 'SwapCached: 0 kB' 'Active: 8096044 kB' 'Inactive: 3476228 kB' 'Active(anon): 7912024 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189304 kB' 'Mapped: 140364 kB' 'AnonPages: 386168 kB' 'Shmem: 7529056 kB' 'KernelStack: 9352 kB' 'PageTables: 
6296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186324 kB' 'Slab: 416132 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 229808 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.402 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223604 kB' 'MemFree: 37605772 kB' 'MemUsed: 6617832 kB' 'SwapCached: 0 kB' 'Active: 3586204 kB' 'Inactive: 1145132 kB' 'Active(anon): 3364780 kB' 'Inactive(anon): 0 kB' 'Active(file): 221424 kB' 'Inactive(file): 1145132 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4540012 kB' 'Mapped: 84556 kB' 'AnonPages: 191452 kB' 'Shmem: 3173456 kB' 'KernelStack: 6536 kB' 'PageTables: 
3044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328804 kB' 'Slab: 478412 kB' 'SReclaimable: 328804 kB' 'SUnreclaim: 149608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.403 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:07.404 node0=512 expecting 512 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:07.404 node1=512 expecting 512 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:07.404 00:04:07.404 real 0m3.908s 00:04:07.404 user 0m1.498s 00:04:07.404 sys 0m2.515s 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:07.404 13:26:54 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:07.404 ************************************ 00:04:07.404 END TEST even_2G_alloc 00:04:07.404 ************************************ 00:04:07.404 13:26:54 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:07.404 13:26:54 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:07.404 13:26:54 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:07.404 13:26:54 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:07.404 13:26:54 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:07.404 
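even_2G_alloc above ends with both NUMA nodes holding 512 of the 1024 requested 2048 kB pages ("node0=512 expecting 512", "node1=512 expecting 512"). The odd_alloc case starting below requests 2098176 kB, which get_test_nr_hugepages turns into nr_hugepages=1025 (visible in its trace), so an even split across the two nodes is impossible; the nodes_test assignments land at 513 on node 0 and 512 on node 1. A minimal sketch of that per-node distribution, assuming the remainder-absorbing split the trace suggests; split_hugepages_sketch and its arguments are hypothetical names, not the SPDK helpers:

    split_hugepages_sketch() {
        local total=$1 no_nodes=$2
        local -a nodes
        local node
        # Walk the nodes from the highest index down; each takes an equal share
        # of whatever is still unassigned, so the node processed last (node 0)
        # absorbs the rounding remainder.
        for ((node = no_nodes - 1; node >= 0; node--)); do
            nodes[node]=$((total / (node + 1)))
            total=$((total - nodes[node]))
        done
        echo "${nodes[@]}"
    }

    # Usage:
    #   split_hugepages_sketch 1025 2   # -> "513 512"  (the odd_alloc layout)
    #   split_hugepages_sketch 1024 2   # -> "512 512"  (the even_2G_alloc layout)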
************************************ 00:04:07.404 START TEST odd_alloc 00:04:07.404 ************************************ 00:04:07.404 13:26:54 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:07.404 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:07.404 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:07.404 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:07.404 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:07.404 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:07.405 13:26:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:11.603 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:11.603 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:04:11.603 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:11.603 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:00:04.4 (8086 2021): Already 
using the vfio-pci driver 00:04:11.603 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:11.603 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.603 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72170580 kB' 'MemAvailable: 76876528 kB' 'Buffers: 16532 kB' 'Cached: 15712856 kB' 'SwapCached: 0 kB' 'Active: 11681904 kB' 'Inactive: 4621360 kB' 'Active(anon): 11276460 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576568 kB' 'Mapped: 224724 kB' 'Shmem: 10702584 kB' 'KReclaimable: 515128 kB' 'Slab: 894384 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 
379256 kB' 'KernelStack: 15808 kB' 'PageTables: 8684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 12670164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201712 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.604 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@28 -- # mapfile -t mem 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72171028 kB' 'MemAvailable: 76876976 kB' 'Buffers: 16532 kB' 'Cached: 15712860 kB' 'SwapCached: 0 kB' 'Active: 11682788 kB' 'Inactive: 4621360 kB' 'Active(anon): 11277344 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577980 kB' 'Mapped: 224696 kB' 'Shmem: 10702588 kB' 'KReclaimable: 515128 kB' 'Slab: 894384 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379256 kB' 'KernelStack: 16144 kB' 'PageTables: 9604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 12670180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201856 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.605 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.606 
13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.606 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72176732 kB' 'MemAvailable: 76882680 kB' 'Buffers: 16532 kB' 'Cached: 15712876 kB' 'SwapCached: 0 kB' 'Active: 11685376 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279932 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580584 kB' 'Mapped: 225196 kB' 'Shmem: 10702604 kB' 'KReclaimable: 515128 kB' 'Slab: 894348 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 379220 kB' 'KernelStack: 16320 kB' 'PageTables: 10060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 12673272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201744 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1285588 kB' 
'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.607 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 
13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:11.608 nr_hugepages=1025 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:11.608 resv_hugepages=0 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:11.608 surplus_hugepages=0 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:11.608 anon_hugepages=0 00:04:11.608 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:11.609 
13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72176052 kB' 'MemAvailable: 76882000 kB' 'Buffers: 16532 kB' 'Cached: 15712896 kB' 'SwapCached: 0 kB' 'Active: 11682172 kB' 'Inactive: 4621360 kB' 'Active(anon): 11276728 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577320 kB' 'Mapped: 224692 kB' 'Shmem: 10702624 kB' 'KReclaimable: 515128 kB' 'Slab: 893996 kB' 'SReclaimable: 515128 kB' 'SUnreclaim: 378868 kB' 'KernelStack: 16048 kB' 'PageTables: 9280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 12670220 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201712 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.609 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 34558132 kB' 'MemUsed: 13511800 kB' 'SwapCached: 0 kB' 'Active: 8096016 kB' 'Inactive: 3476228 kB' 'Active(anon): 7911996 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189340 kB' 'Mapped: 140140 kB' 'AnonPages: 386076 kB' 'Shmem: 7529092 kB' 'KernelStack: 9256 kB' 'PageTables: 5788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 
'KReclaimable: 186324 kB' 'Slab: 415696 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 229372 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.610 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.611 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223604 kB' 'MemFree: 37617476 kB' 'MemUsed: 6606128 kB' 'SwapCached: 0 kB' 'Active: 3585948 kB' 'Inactive: 1145132 kB' 'Active(anon): 3364524 kB' 'Inactive(anon): 0 kB' 'Active(file): 221424 kB' 'Inactive(file): 1145132 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4540124 kB' 'Mapped: 84552 kB' 'AnonPages: 191032 kB' 'Shmem: 3173568 kB' 'KernelStack: 6552 kB' 'PageTables: 3132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328804 kB' 'Slab: 478292 kB' 'SReclaimable: 328804 kB' 'SUnreclaim: 149488 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 
13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.612 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 
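The xtrace run above is setup/common.sh's get_meminfo() stepping through /proc/meminfo one field at a time until it reaches HugePages_Surp, then echoing its value (0) and returning. A minimal sketch of that lookup pattern, assuming a plain /proc/meminfo read (the real helper also handles per-node meminfo files and strips their "Node N " prefix); the name lookup_meminfo is illustrative, not taken from the script:

    lookup_meminfo() {                         # usage: lookup_meminfo <field>
        local get=$1 var val _
        while IFS=': ' read -r var val _; do   # e.g. "HugePages_Surp:    0"
            if [[ $var == "$get" ]]; then
                echo "$val"                    # kB figure for most fields, bare count for HugePages_*
                return 0
            fi
        done </proc/meminfo
        return 1
    }

    surp=$(lookup_meminfo HugePages_Surp)      # 0 here, matching the value echoed in the trace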
00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:11.613 node0=512 expecting 513 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:11.613 node1=513 expecting 512 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:11.613 00:04:11.613 real 0m3.884s 00:04:11.613 user 0m1.507s 00:04:11.613 sys 0m2.475s 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:11.613 13:26:58 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:11.613 ************************************ 00:04:11.613 END TEST odd_alloc 00:04:11.613 ************************************ 00:04:11.613 13:26:58 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:11.613 13:26:58 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:11.613 13:26:58 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:11.613 13:26:58 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:11.613 13:26:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:11.613 ************************************ 00:04:11.613 START TEST custom_alloc 00:04:11.613 ************************************ 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@62 -- # user_nodes=() 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:11.613 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:11.614 13:26:58 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.614 13:26:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:14.904 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:14.904 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:04:14.904 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:14.904 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:00:04.2 (8086 2021): Already using the vfio-pci 
driver 00:04:14.904 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:14.904 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:14.904 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 71033480 kB' 'MemAvailable: 75739396 kB' 'Buffers: 16532 kB' 'Cached: 15713016 kB' 'SwapCached: 0 kB' 'Active: 11690372 kB' 'Inactive: 4621360 kB' 'Active(anon): 11284928 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585472 kB' 'Mapped: 225740 kB' 'Shmem: 10702744 kB' 'KReclaimable: 515096 kB' 'Slab: 895104 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 
380008 kB' 'KernelStack: 15776 kB' 'PageTables: 8856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962508 kB' 'Committed_AS: 12677288 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201668 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
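Stepping back from the field-by-field walk for a moment: earlier in this trace custom_alloc sized its pool with two get_test_nr_hugepages calls (1048576 and 2097152) and ended up with HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' and nr_hugepages=1536. Judging from those numbers, the size argument is in kB and the divisor is the 2048 kB hugepage size reported in the meminfo snapshot; a sketch of that arithmetic, with illustrative variable names:

    hugepagesize_kb=2048                              # "Hugepagesize: 2048 kB" in the snapshot
    node0_kb=1048576                                  # first get_test_nr_hugepages argument
    node1_kb=2097152                                  # second get_test_nr_hugepages argument
    node0_pages=$(( node0_kb / hugepagesize_kb ))     # 512
    node1_pages=$(( node1_kb / hugepagesize_kb ))     # 1024
    total_pages=$(( node0_pages + node1_pages ))      # 1536, the nr_hugepages verified here
    HUGENODE="nodes_hp[0]=${node0_pages},nodes_hp[1]=${node1_pages}"
    echo "$HUGENODE $total_pages"                     # nodes_hp[0]=512,nodes_hp[1]=1024 1536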
00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.905 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # 
local var val 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 71038676 kB' 'MemAvailable: 75744592 kB' 'Buffers: 16532 kB' 'Cached: 15713020 kB' 'SwapCached: 0 kB' 'Active: 11691016 kB' 'Inactive: 4621360 kB' 'Active(anon): 11285572 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 586152 kB' 'Mapped: 225740 kB' 'Shmem: 10702748 kB' 'KReclaimable: 515096 kB' 'Slab: 895088 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379992 kB' 'KernelStack: 15776 kB' 'PageTables: 8848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962508 kB' 'Committed_AS: 12677304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201620 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.906 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
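The snapshot that verify_nr_hugepages keeps re-reading here already carries the figures this test cares about: HugePages_Total: 1536, HugePages_Free: 1536, HugePages_Surp: 0, Hugepagesize: 2048 kB and Hugetlb: 3145728 kB. A quick consistency check of those values, reusing the illustrative lookup_meminfo helper sketched above:

    total=$(lookup_meminfo HugePages_Total)      # 1536
    free=$(lookup_meminfo HugePages_Free)        # 1536
    size_kb=$(lookup_meminfo Hugepagesize)       # 2048
    hugetlb_kb=$(lookup_meminfo Hugetlb)         # 3145728
    (( total * size_kb == hugetlb_kb )) && \
        echo "pool accounts for $(( hugetlb_kb / 1024 )) MiB"    # 1536 * 2048 kB = 3072 MiB
    (( free == total )) && \
        echo "all $total pages reported free at this point in the run"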
00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 
13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.907 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 71039264 kB' 
'MemAvailable: 75745180 kB' 'Buffers: 16532 kB' 'Cached: 15713036 kB' 'SwapCached: 0 kB' 'Active: 11690168 kB' 'Inactive: 4621360 kB' 'Active(anon): 11284724 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585276 kB' 'Mapped: 225696 kB' 'Shmem: 10702764 kB' 'KReclaimable: 515096 kB' 'Slab: 895132 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 380036 kB' 'KernelStack: 15792 kB' 'PageTables: 8904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962508 kB' 'Committed_AS: 12677324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201620 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 
13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.908 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.909 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:14.910 nr_hugepages=1536 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:14.910 resv_hugepages=0 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:14.910 surplus_hugepages=0 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:14.910 anon_hugepages=0 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 71039264 kB' 'MemAvailable: 75745180 kB' 'Buffers: 16532 kB' 'Cached: 15713036 kB' 'SwapCached: 0 kB' 'Active: 11690256 kB' 'Inactive: 4621360 kB' 'Active(anon): 11284812 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585420 kB' 'Mapped: 225696 kB' 'Shmem: 10702764 kB' 'KReclaimable: 515096 kB' 'Slab: 895132 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 380036 kB' 'KernelStack: 15808 kB' 
'PageTables: 8956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962508 kB' 'Committed_AS: 12677348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201636 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.910 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:14.911 13:27:02 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:14.911 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 34485184 kB' 'MemUsed: 13584748 kB' 'SwapCached: 0 kB' 'Active: 8096812 kB' 'Inactive: 3476228 kB' 'Active(anon): 7912792 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189388 kB' 'Mapped: 141160 kB' 'AnonPages: 386912 kB' 'Shmem: 7529140 kB' 'KernelStack: 9208 kB' 'PageTables: 5720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186324 kB' 'Slab: 416400 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 230076 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 
13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.912 13:27:02 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue
[... setup/common.sh@31-32: remaining node0 meminfo fields (Mlocked through HugePages_Free) read and skipped while scanning for HugePages_Surp ...]
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:14.913 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223604 kB' 'MemFree: 36554080 kB' 'MemUsed: 7669524 kB' 'SwapCached: 0 kB' 'Active: 3594264 kB' 'Inactive: 1145132 kB' 'Active(anon): 3372840 kB' 'Inactive(anon): 0 kB' 'Active(file): 221424 kB' 'Inactive(file): 1145132 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4540180 kB' 'Mapped: 84552 kB' 'AnonPages: 199380 kB' 'Shmem: 3173624 kB' 'KernelStack: 6616 kB' 'PageTables: 3352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328772 kB' 'Slab: 478732 kB' 'SReclaimable: 328772 kB' 'SUnreclaim: 149960 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... setup/common.sh@31-32: each node1 meminfo field (MemTotal through HugePages_Free) read and skipped while scanning for HugePages_Surp ...]
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
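The get_meminfo calls traced above reduce to a small amount of shell: pick /proc/meminfo or, when a node argument is given, /sys/devices/system/node/nodeN/meminfo (whose lines carry a "Node N " prefix that gets stripped), then split each "Field: value" line on ': ' and print the value once the requested field is found. A minimal standalone sketch of that flow follows; the function name and the simplified prefix stripping are illustrative, not the exact code in setup/common.sh.

  #!/usr/bin/env bash
  # Sketch: fetch one meminfo field, system-wide or for a single NUMA node.
  get_meminfo_sketch() {
      local get=$1 node=${2:-}        # field name, optional node number
      local mem_f=/proc/meminfo
      # Per-node statistics live in sysfs and prefix every line with "Node <n> ".
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem < "$mem_f"
      [[ -n $node ]] && mem=("${mem[@]#Node $node }")   # drop the "Node N " prefix
      local line var val _
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          if [[ $var == "$get" ]]; then
              echo "${val:-0}"
              return 0
          fi
      done
      echo 0
  }
  # e.g. get_meminfo_sketch HugePages_Surp 1  ->  0, matching the node1 read above

The field-by-field scan in this sketch is the reason the trace shows one IFS/read/continue triple per meminfo line.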
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:15.174 node0=512 expecting 512
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:15.174 node1=1024 expecting 1024
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:15.174
00:04:15.174 real 0m3.633s
00:04:15.174 user 0m1.382s
00:04:15.174 sys 0m2.334s
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:15.174 13:27:02 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:15.174 ************************************
00:04:15.174 END TEST custom_alloc
00:04:15.174 ************************************
00:04:15.174 13:27:02 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:15.174 13:27:02 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:15.174 13:27:02 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:15.174 13:27:02 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:15.174 13:27:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:15.174 ************************************
00:04:15.174 START TEST no_shrink_alloc
00:04:15.174 ************************************
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
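Two results land in the block above: custom_alloc passes because the per-node counts it reconstructed match the expected '512,1024' string, and no_shrink_alloc then asks get_test_nr_hugepages for 2097152 worth of default-size hugepages pinned to node 0, which the trace resolves to nr_hugepages=1024. With the 2048 kB Hugepagesize reported later in this log, 2097152 / 2048 = 1024, so the numbers are consistent if the size argument is interpreted in kB; that unit assumption, and the helper below, are mine rather than the literal hugepages.sh code.

  # Replaying the arithmetic and the comparison seen in the trace (illustrative only).
  size_kb=2097152        # argument passed to get_test_nr_hugepages
  hugepagesize_kb=2048   # "Hugepagesize: 2048 kB" from /proc/meminfo
  echo $(( size_kb / hugepagesize_kb ))          # 1024, the nr_hugepages value logged at @57

  # custom_alloc's final check: join the per-node counts and compare with the expectation.
  nodes_test=([0]=512 [1]=1024)
  actual="${nodes_test[0]},${nodes_test[1]}"     # "512,1024"
  [[ $actual == "512,1024" ]] && echo "custom_alloc: per-node hugepage split OK"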
00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.174 13:27:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:18.467 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:18.467 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:04:18.467 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:18.467 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:18.467 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:18.467 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72069944 kB' 'MemAvailable: 76775860 kB' 'Buffers: 16532 kB' 'Cached: 15713172 kB' 'SwapCached: 0 kB' 'Active: 11685532 kB' 'Inactive: 4621360 kB' 'Active(anon): 11280088 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579396 kB' 'Mapped: 225048 kB' 'Shmem: 10702900 kB' 'KReclaimable: 515096 kB' 'Slab: 894736 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379640 kB' 'KernelStack: 15744 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12670120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201648 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB'
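At this point verify_nr_hugepages has confirmed that transparent hugepages are not forced to [never] (the 'always [madvise] never' value in the trace) and is extracting AnonHugePages from the /proc/meminfo snapshot just printed. The same snapshot also lets you sanity-check the hugetlb pool, since HugePages_Total times Hugepagesize should match the Hugetlb line. A small standalone version of both checks follows; the sysfs path is the standard kernel location and the awk extraction is shorthand for the field-scan loop, not the literal code in setup/hugepages.sh.

  #!/usr/bin/env bash
  # THP state, e.g. "always [madvise] never" as seen in the trace above.
  thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
  if [[ $thp != *"[never]"* ]]; then
      # THP may create anonymous huge pages, so record the current AnonHugePages figure.
      anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
      echo "AnonHugePages: ${anon_kb} kB"     # 0 kB in the snapshot above
  fi

  # Hugetlb accounting from the same snapshot:
  #   HugePages_Total: 1024, Hugepagesize: 2048 kB, Hugetlb: 2097152 kB
  total=1024 pagesize_kb=2048 hugetlb_kb=2097152
  (( total * pagesize_kb == hugetlb_kb )) && echo "hugetlb pool accounting is consistent"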
[... setup/common.sh@31-32: each /proc/meminfo field from MemTotal through VmallocTotal read and skipped while scanning for AnonHugePages ...]
00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed ==
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72070544 kB' 'MemAvailable: 76776460 kB' 'Buffers: 16532 kB' 'Cached: 15713172 kB' 'SwapCached: 0 kB' 'Active: 11685120 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279676 
kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579984 kB' 'Mapped: 225956 kB' 'Shmem: 10702900 kB' 'KReclaimable: 515096 kB' 'Slab: 894920 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379824 kB' 'KernelStack: 15792 kB' 'PageTables: 8800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12672956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201632 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.469 13:27:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _
[... setup/common.sh@31-32: each /proc/meminfo field from Active through NFS_Unstable read and skipped while scanning for HugePages_Surp ...]
00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce ==
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
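What this stretch of xtrace is doing: get_meminfo in setup/common.sh reads /proc/meminfo with IFS=': ' and read -r var val _, and skips (continue) every field until it reaches the one requested key (here HugePages_Surp), whose value the caller in setup/hugepages.sh stores as surp. A minimal standalone sketch of the same lookup pattern is below; the function name get_meminfo_value and its exact shape are illustrative assumptions, not the project's actual helper.

```bash
# Minimal sketch of the lookup pattern traced here (an illustrative stand-in,
# not the exact get_meminfo helper from setup/common.sh): read /proc/meminfo
# with IFS=': ' and print the value of a single requested key. The real helper
# also supports per-NUMA-node meminfo files by stripping the "Node <n> "
# prefix before matching, which this sketch omits.
get_meminfo_value() {
    local get=$1
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue  # skip MemTotal, MemFree, ... until the key matches
        echo "$val"
        return 0
    done </proc/meminfo
    return 1
}

get_meminfo_value HugePages_Surp   # prints 0 in the run logged here
```

For scale, the meminfo snapshots in this log report Hugepagesize: 2048 kB and HugePages_Total: 1024, which matches the Hugetlb: 2097152 kB (1024 x 2048 kB) shown alongside them.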
00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.470 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72070044 kB' 'MemAvailable: 76775960 kB' 'Buffers: 16532 kB' 'Cached: 15713176 kB' 'SwapCached: 0 kB' 'Active: 11684328 kB' 'Inactive: 4621360 kB' 'Active(anon): 11278884 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579220 kB' 'Mapped: 224948 kB' 'Shmem: 10702904 kB' 
'KReclaimable: 515096 kB' 'Slab: 894920 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379824 kB' 'KernelStack: 15712 kB' 'PageTables: 8548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12670164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201616 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
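The same field-by-field scan repeats here for HugePages_Rsvd, after which the script echoes nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0 and verifies that the counters add up (the (( 1024 == nr_hugepages + surp + resv )) and (( 1024 == nr_hugepages )) checks a little further down, followed by one more get_meminfo HugePages_Total lookup). A hedged sketch of that consistency check is below, reusing the illustrative get_meminfo_value helper from the previous sketch; the exact variable wiring inside setup/hugepages.sh may differ.

```bash
# Sketch of the hugepage accounting check seen in this trace (illustrative; the
# real logic lives in setup/hugepages.sh). Values from this run: a pool of
# 1024 pages with no reserved and no surplus pages.
nr_hugepages=1024                            # expected pool size for this run
surp=$(get_meminfo_value HugePages_Surp)     # 0 in this log
resv=$(get_meminfo_value HugePages_Rsvd)     # 0 in this log
total=$(get_meminfo_value HugePages_Total)   # the lookup the trace performs next

# With surp=0 and resv=0 both assertions reduce to total == 1024, mirroring the
# "(( 1024 == nr_hugepages + surp + resv ))" check echoed later in this log.
(( total == nr_hugepages + surp + resv )) || exit 1
(( total == nr_hugepages )) || exit 1
echo "hugepage pool consistent: total=$total surplus=$surp reserved=$resv"
```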
00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.471 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.472 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:18.473 nr_hugepages=1024 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:18.473 resv_hugepages=0 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:18.473 surplus_hugepages=0 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:18.473 anon_hugepages=0 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72069176 kB' 'MemAvailable: 76775092 kB' 'Buffers: 16532 kB' 'Cached: 15713212 kB' 'SwapCached: 0 kB' 'Active: 11683964 kB' 'Inactive: 4621360 kB' 'Active(anon): 11278520 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579312 kB' 'Mapped: 224948 kB' 'Shmem: 10702940 kB' 'KReclaimable: 515096 kB' 'Slab: 894920 kB' 'SReclaimable: 
515096 kB' 'SUnreclaim: 379824 kB' 'KernelStack: 15728 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12670344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201648 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.473 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:18.474 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 33456088 kB' 'MemUsed: 14613844 kB' 'SwapCached: 0 kB' 'Active: 8096076 kB' 'Inactive: 3476228 kB' 'Active(anon): 7912056 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189472 kB' 'Mapped: 140412 kB' 'AnonPages: 386020 kB' 'Shmem: 7529224 kB' 'KernelStack: 9144 kB' 'PageTables: 5476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186324 kB' 'Slab: 416172 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 229848 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 
13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.475 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:18.476 node0=1024 expecting 1024 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:18.476 13:27:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:21.769 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:21.769 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:04:21.769 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:21.769 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:00:04.0 
(8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.769 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:21.769 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.769 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72106748 kB' 'MemAvailable: 76812664 kB' 'Buffers: 16532 kB' 'Cached: 15713304 kB' 'SwapCached: 0 kB' 'Active: 11685028 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279584 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579756 kB' 'Mapped: 225164 kB' 'Shmem: 10703032 kB' 'KReclaimable: 515096 kB' 'Slab: 894488 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379392 kB' 'KernelStack: 15712 kB' 'PageTables: 8596 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201648 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
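The long runs of "-- # continue" above are the xtrace of the get_meminfo helper in setup/common.sh: it loads /proc/meminfo (or /sys/devices/system/node/node<N>/meminfo when a node is given, stripping the "Node <N> " prefix), then walks the keys one by one until it reaches the requested field (HugePages_Total, HugePages_Surp, AnonHugePages) and echoes its value. A minimal self-contained sketch of that lookup, reconstructed from the trace (function and file names follow the script tags in the log; the real SPDK helper may differ in detail):

#!/usr/bin/env bash
# Sketch of the meminfo lookup traced above; extglob is assumed for the
# "Node <N> " prefix strip, as the trace shows.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    local -a mem
    local line var val _

    # Per-node queries read the node's own meminfo; its lines carry a
    # "Node <N> " prefix that has to be stripped before key matching.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")

    # Scan key by key; every non-matching key produces one of the
    # "[[ <key> == ... ]] / continue" pairs seen in the xtrace.
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

get_meminfo HugePages_Total    # 1024 on this host, per the log
get_meminfo HugePages_Surp 0   # 0 surplus pages on node0, per the log

On this run the lookups return 1024 for HugePages_Total and 0 for HugePages_Surp on node0; verify_nr_hugepages then folds the per-node reserved and surplus pages into nodes_test, which is what hugepages.sh reports as "node0=1024 expecting 1024".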
00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.770 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@97 -- # anon=0 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72107256 kB' 'MemAvailable: 76813172 kB' 'Buffers: 16532 kB' 'Cached: 15713304 kB' 'SwapCached: 0 kB' 'Active: 11685360 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279916 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580012 kB' 'Mapped: 224984 kB' 'Shmem: 10703032 kB' 'KReclaimable: 515096 kB' 'Slab: 894488 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379392 kB' 'KernelStack: 15712 kB' 'PageTables: 8592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669584 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201648 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue
00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.771 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... the same @31/@32 read, compare, continue sequence repeats for every remaining /proc/meminfo field (Unevictable through HugePages_Rsvd) ...]
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
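
The xtrace above is setup/common.sh's get_meminfo helper scanning "<field>: <value>" pairs until the requested field (HugePages_Surp) matches, then echoing its value (0 here). A minimal stand-alone sketch of that pattern, assuming a hypothetical get_meminfo_sketch name and a sed-based strip of the per-node "Node N " prefix in place of the script's extglob expansion:

    #!/usr/bin/env bash
    # Read one field from /proc/meminfo, or from a NUMA node's meminfo file
    # when a node index is given (those lines carry a "Node N " prefix).
    get_meminfo_sketch() {
        local get=$1 node=$2
        local var val _ mem_f

        mem_f=/proc/meminfo
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo

        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && echo "$val" && return 0
            continue    # not the field we want; keep scanning
        done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
        return 1
    }

    # Usage matching the traced calls:
    #   get_meminfo_sketch HugePages_Surp      -> 0   (system-wide)
    #   get_meminfo_sketch HugePages_Surp 0    -> 0   (NUMA node 0)
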
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72112852 kB' 'MemAvailable: 76818768 kB' 'Buffers: 16532 kB' 'Cached: 15713304 kB' 'SwapCached: 0 kB' 'Active: 11685108 kB' 'Inactive: 4621360 kB' 'Active(anon): 11279664 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579788 kB' 'Mapped: 224984 kB' 'Shmem: 10703032 kB' 'KReclaimable: 515096 kB' 'Slab: 894456 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379360 kB' 'KernelStack: 15680 kB' 'PageTables: 8476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669608 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201616 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB'
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.773 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... the same @31/@32 read, compare, continue sequence repeats for every /proc/meminfo field until HugePages_Rsvd matches ...]
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:21.775 nr_hugepages=1024
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:21.775 resv_hugepages=0
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:21.775 surplus_hugepages=0
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:21.775 anon_hugepages=0
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
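
At this point the test has surp=0 and resv=0, prints the pool summary (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0), and then checks that the kernel's counters add up before re-reading HugePages_Total. A sketch of that accounting step, reusing the hypothetical get_meminfo_sketch helper above (the verify_hugepage_pool name is illustrative, not the SPDK function itself):

    # Verify the hugepage pool the test configured is what the kernel reports.
    verify_hugepage_pool() {
        local nr_hugepages=$1              # requested pool size (1024 in this run)
        local surp resv total

        surp=$(get_meminfo_sketch HugePages_Surp)
        resv=$(get_meminfo_sketch HugePages_Rsvd)
        total=$(get_meminfo_sketch HugePages_Total)

        echo "nr_hugepages=$nr_hugepages"
        echo "resv_hugepages=$resv"
        echo "surplus_hugepages=$surp"

        # The pool is consistent when the kernel-reported total accounts for
        # the requested pages plus any surplus and reserved pages
        # (1024 == 1024 + 0 + 0 here).
        (( total == nr_hugepages + surp + resv ))
    }
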
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72113152 kB' 'MemAvailable: 76819068 kB' 'Buffers: 16532 kB' 'Cached: 15713364 kB' 'SwapCached: 0 kB' 'Active: 11684064 kB' 'Inactive: 4621360 kB' 'Active(anon): 11278620 kB' 'Inactive(anon): 0 kB' 'Active(file): 405444 kB' 'Inactive(file): 4621360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 578720 kB' 'Mapped: 224956 kB' 'Shmem: 10703092 kB' 'KReclaimable: 515096 kB' 'Slab: 894520 kB' 'SReclaimable: 515096 kB' 'SUnreclaim: 379424 kB' 'KernelStack: 15712 kB' 'PageTables: 8564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 12669628 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201616 kB' 'VmallocChunk: 0 kB' 'Percpu: 102080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1285588 kB' 'DirectMap2M: 22507520 kB' 'DirectMap1G: 77594624 kB'
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.775 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... the same @31/@32 read, compare, continue sequence repeats for every /proc/meminfo field until HugePages_Total matches ...]
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.776 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
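
The HugePages_Total check passed (1024 == 1024 + 0 + 0), so the test moves to the per-node view: get_nodes globs /sys/devices/system/node/node* (two nodes in this run, with all 1024 pages placed on node0), and each node is then queried through its own meminfo file. A sketch of that walk; the nr_hugepages sysfs path below is the standard kernel interface for per-node pools, but where the traced script actually reads the per-node counts from is not visible in this excerpt:

    # Discover NUMA nodes and their 2 MiB hugepage counts, then ask each node
    # for its surplus pages via /sys/devices/system/node/nodeN/meminfo.
    declare -A nodes_sys
    for node in /sys/devices/system/node/node[0-9]*; do
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]}"               # 2 in this run: node0=1024, node1=0

    for n in "${!nodes_sys[@]}"; do
        get_meminfo_sketch HugePages_Surp "$n"     # reads node$n/meminfo, 0 expected
    done
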
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069932 kB' 'MemFree: 33480692 kB' 'MemUsed: 14589240 kB' 'SwapCached: 0 kB' 'Active: 8096392 kB' 'Inactive: 3476228 kB' 'Active(anon): 7912372 kB' 'Inactive(anon): 0 kB' 'Active(file): 184020 kB' 'Inactive(file): 3476228 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11189460 kB' 'Mapped: 140420 kB' 'AnonPages: 386256 kB' 'Shmem: 7529212 kB' 'KernelStack: 9128 kB' 'PageTables: 5476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186324 kB' 'Slab: 415964 kB' 'SReclaimable: 186324 kB' 'SUnreclaim: 229640 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.777 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 
13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:21.778 13:27:09 
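Editor's note: the long run of "[[ <key> == HugePages_Surp ]] / continue" entries above is the trace of the setup helper scanning a node's meminfo until it reaches the requested field. A minimal sketch of that lookup, simplified from the traced test/setup/common.sh commands (error handling trimmed), is:

#!/usr/bin/env bash
# Minimal sketch of the meminfo lookup traced above (simplified from setup/common.sh).
shopt -s extglob                        # needed for the "Node <n> " prefix strip

get_meminfo() {
  local get=$1 node=$2
  local mem mem_f=/proc/meminfo var val _ line
  # A node argument switches to that node's sysfs meminfo, whose lines carry a
  # "Node <n> " prefix in front of every key.
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")      # drop the per-node prefix
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == "$get" ]] || continue    # the long run of "continue"s in the trace
    echo "$val"                         # kB value, or a page count for HugePages_*
    return 0
  done
  return 1
}

# e.g. get_meminfo HugePages_Surp 0   -> 0 for node0 in the trace above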
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:21.778 node0=1024 expecting 1024 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:21.778 00:04:21.778 real 0m6.579s 00:04:21.778 user 0m2.473s 00:04:21.778 sys 0m4.174s 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:21.778 13:27:09 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:21.778 ************************************ 00:04:21.778 END TEST no_shrink_alloc 00:04:21.778 ************************************ 00:04:21.778 13:27:09 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:21.778 13:27:09 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:21.778 00:04:21.778 real 0m27.543s 00:04:21.778 user 0m10.010s 00:04:21.778 sys 0m16.817s 00:04:21.778 13:27:09 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:21.778 13:27:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:21.778 ************************************ 00:04:21.778 END TEST hugepages 00:04:21.778 ************************************ 00:04:21.778 13:27:09 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:21.778 13:27:09 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:21.778 13:27:09 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:21.778 13:27:09 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:21.778 13:27:09 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:21.778 ************************************ 00:04:21.778 START TEST driver 00:04:21.778 ************************************ 00:04:21.778 13:27:09 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:22.037 * Looking for test storage... 
00:04:22.037 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:22.037 13:27:09 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:22.037 13:27:09 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:22.037 13:27:09 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:27.303 13:27:14 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:27.303 13:27:14 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:27.303 13:27:14 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:27.303 13:27:14 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:27.303 ************************************ 00:04:27.303 START TEST guess_driver 00:04:27.303 ************************************ 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 215 > 0 )) 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:27.303 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:27.303 Looking for driver=vfio-pci 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:27.303 13:27:14 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:27.304 13:27:14 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:27.304 13:27:14 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.879 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.138 13:27:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:31.517 13:27:18 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:36.879 00:04:36.879 real 
0m9.361s 00:04:36.879 user 0m2.560s 00:04:36.879 sys 0m4.906s 00:04:36.879 13:27:23 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:36.879 13:27:23 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:36.879 ************************************ 00:04:36.879 END TEST guess_driver 00:04:36.879 ************************************ 00:04:36.879 13:27:23 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:04:36.879 00:04:36.879 real 0m14.261s 00:04:36.879 user 0m4.006s 00:04:36.879 sys 0m7.679s 00:04:36.879 13:27:23 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:36.879 13:27:23 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:36.879 ************************************ 00:04:36.879 END TEST driver 00:04:36.879 ************************************ 00:04:36.879 13:27:23 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:36.879 13:27:23 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:36.879 13:27:23 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:36.879 13:27:23 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:36.879 13:27:23 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:36.879 ************************************ 00:04:36.879 START TEST devices 00:04:36.879 ************************************ 00:04:36.879 13:27:23 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:36.879 * Looking for test storage... 00:04:36.879 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:36.879 13:27:23 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:36.879 13:27:23 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:36.879 13:27:23 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:36.879 13:27:23 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:40.169 13:27:27 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@200 -- # 
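Editor's note: before the devices trace below gets going, the guess_driver run that just finished above reduces to a short decision. A condensed, hedged sketch of it (the function name and the fallback message handling are mine; the individual checks are the ones traced from setup/driver.sh):

# Condensed sketch of the driver decision traced above (setup/driver.sh):
# vfio-pci is chosen when IOMMU groups exist (215 on this host) or unsafe
# no-IOMMU mode is enabled, and modprobe can resolve the vfio_pci module chain.
pick_vfio_driver() {
  local unsafe_vfio=N
  if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
    unsafe_vfio=$(</sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
  fi
  local iommu_groups=(/sys/kernel/iommu_groups/*)
  if (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == [Yy] ]]; then
    # The traced check accepts the driver if the dependency list names .ko files.
    if modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
      echo vfio-pci
      return 0
    fi
  fi
  echo 'No valid driver found'   # the string the caller tests for in the trace
  return 1
}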
for block in "/sys/block/nvme"!(*c*) 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:40.169 13:27:27 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:40.169 No valid GPT data, bailing 00:04:40.169 13:27:27 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:40.169 13:27:27 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:40.169 13:27:27 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:40.169 13:27:27 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:40.169 13:27:27 setup.sh.devices -- setup/common.sh@80 -- # echo 3840755982336 00:04:40.429 13:27:27 setup.sh.devices -- setup/devices.sh@204 -- # (( 3840755982336 >= min_disk_size )) 00:04:40.429 13:27:27 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:40.429 13:27:27 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:40.429 13:27:27 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:40.429 13:27:27 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:40.429 13:27:27 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:40.429 13:27:27 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:40.429 13:27:27 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:40.429 13:27:27 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:40.429 ************************************ 00:04:40.429 START TEST nvme_mount 00:04:40.429 ************************************ 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- 
setup/common.sh@44 -- # parts=() 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:40.429 13:27:27 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:41.367 Creating new GPT entries in memory. 00:04:41.367 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:41.367 other utilities. 00:04:41.367 13:27:28 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:41.367 13:27:28 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:41.367 13:27:28 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:41.367 13:27:28 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:41.367 13:27:28 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:42.305 Creating new GPT entries in memory. 00:04:42.305 The operation has completed successfully. 
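Editor's note: the sgdisk chatter above ("GPT data structures destroyed!", "The operation has completed successfully.") is the nvme_mount test laying down a single 1 GiB partition before formatting it. A rough equivalent of the traced commands is below; the real test wraps sgdisk in scripts/sync_dev_uevents.sh to wait for the new partition node, and udevadm settle here is only a stand-in for that.

# Rough sketch of the partitioning step traced above (setup/common.sh partition_drive).
disk=/dev/nvme0n1
sgdisk "$disk" --zap-all                    # "GPT data structures destroyed!"
# 1 GiB = 1073741824 B / 512 B sectors = 2097152 sectors -> LBA 2048..2099199
flock "$disk" sgdisk "$disk" --new=1:2048:2099199
udevadm settle                              # stand-in for sync_dev_uevents.sh

The trace that follows then runs mkfs.ext4 -qF on nvme0n1p1 and mounts it under test/setup/nvme_mount before the verify pass.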
00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 4114171 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:42.305 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.564 13:27:29 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:45.851 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:45.851 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:46.108 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:46.108 /dev/nvme0n1: 8 bytes were erased at offset 0x37e3ee55e00 (gpt): 45 46 49 20 50 41 52 54 00:04:46.108 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:46.108 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:46.108 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:46.108 13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:46.108 13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.108 13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:46.108 
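Editor's note: the "bytes were erased" lines above come from the cleanup pass between the two mount flavours: the partition-based mount is torn down and every signature wiped before the whole disk is reformatted. A hedged sketch of that teardown (argument handling is mine; the commands are the ones traced from setup/devices.sh cleanup_nvme):

# Hedged sketch of the teardown traced above (setup/devices.sh cleanup_nvme).
cleanup_nvme() {
  local mnt=$1 disk=${2:-/dev/nvme0n1}
  mountpoint -q "$mnt" && umount "$mnt"           # drop the nvme_mount mount
  [[ -b ${disk}p1 ]] && wipefs --all "${disk}p1"  # "2 bytes were erased ... 53 ef"
  [[ -b $disk ]] && wipefs --all "$disk"          # GPT headers, backup GPT, PMBR
}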
13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.365 13:27:33 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:04:49.649 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.650 13:27:36 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.970 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:52.971 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:52.971 00:04:52.971 real 0m12.730s 00:04:52.971 user 0m3.626s 00:04:52.971 sys 0m6.998s 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:52.971 13:27:40 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:52.971 ************************************ 00:04:52.971 END TEST nvme_mount 00:04:52.971 ************************************ 00:04:53.230 13:27:40 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:04:53.230 13:27:40 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:53.230 13:27:40 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:53.230 13:27:40 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:53.230 13:27:40 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:53.230 ************************************ 00:04:53.230 START TEST dm_mount 00:04:53.230 ************************************ 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:53.230 13:27:40 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:53.230 13:27:40 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:54.169 Creating new GPT entries in memory. 00:04:54.169 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:54.169 other utilities. 00:04:54.169 13:27:41 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:54.169 13:27:41 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.169 13:27:41 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:54.169 13:27:41 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:54.169 13:27:41 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:55.107 Creating new GPT entries in memory. 00:04:55.107 The operation has completed successfully. 00:04:55.107 13:27:42 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:55.107 13:27:42 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.107 13:27:42 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:55.107 13:27:42 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:55.107 13:27:42 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:56.484 The operation has completed successfully. 
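The two sgdisk calls above fall straight out of the arithmetic traced from setup/common.sh: the 1 GiB partition size (1073741824 bytes) is converted to 512-byte sectors, the first partition starts at sector 2048, and each later partition starts one sector past the previous end. A condensed sketch of that partition loop, reconstructed from the trace rather than copied from the script:

  disk=nvme0n1 part_no=2 size=1073741824
  (( size /= 512 ))                              # 1 GiB -> 2097152 sectors per partition
  sgdisk "/dev/$disk" --zap-all                  # wipe the old GPT first
  part_start=0 part_end=0
  for (( part = 1; part <= part_no; part++ )); do
      (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
      (( part_end = part_start + size - 1 ))
      flock "/dev/$disk" sgdisk "/dev/$disk" --new=$part:$part_start:$part_end
  done
  # part 1: --new=1:2048:2099199, part 2: --new=2:2099200:4196351, matching the log above

The flock around each sgdisk call serializes access to the disk node, presumably so the two partition writes and udev's partition rescans do not race.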
00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 4118414 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.484 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
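Everything from dmsetup create down to the mount above is the dm_mount setup path: build a device-mapper node named nvme_dm_test on top of the two fresh partitions, wait for /dev/mapper/nvme_dm_test to appear, resolve it to its dm-N name, confirm both partitions list that dm device as a holder, then format and mount it for the verify step. A rough sketch of the sequence; the dmsetup table itself is not echoed in the trace, so the linear concatenation shown here is an assumption:

  dm_name=nvme_dm_test
  dm_mount=/path/to/spdk/test/setup/dm_mount     # shortened stand-in for the workspace path in the log
  # assumed mapping: the two 2097152-sector partitions concatenated into one linear device
  printf '%s\n' \
      "0 2097152 linear /dev/nvme0n1p1 0" \
      "2097152 2097152 linear /dev/nvme0n1p2 0" | dmsetup create "$dm_name"
  for t in {1..5}; do [[ -e /dev/mapper/$dm_name ]] && break; sleep 1; done
  dm=$(basename "$(readlink -f "/dev/mapper/$dm_name")")      # dm-0 in this run
  [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]]
  [[ -e /sys/class/block/nvme0n1p2/holders/$dm ]]
  mkdir -p "$dm_mount"
  mkfs.ext4 -qF "/dev/mapper/$dm_name"
  mount "/dev/mapper/$dm_name" "$dm_mount"

The holders check is what the later verify stage relies on: setup.sh reports the partitions as holder@nvme0n1p1:dm-0 and holder@nvme0n1p2:dm-0, so the allowed NVMe device must stay bound while the dm target exists.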
00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.485 13:27:43 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:59.772 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local 
mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.773 13:27:47 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
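The @62 checks continue below for the remaining PCI functions; each @60/@62 pair is one iteration of the same scan. What this block does, here and in the two nvme_mount passes earlier, is re-run the setup.sh config output with PCI_ALLOWED=0000:5e:00.0 and read it line by line: only the line for the allowed device is tested for an 'Active devices:' status naming the expected mounts, every other PCI address fails the first comparison and is skipped. A condensed sketch of the loop in setup/devices.sh, with the per-mount splitting omitted:

  dev=0000:5e:00.0
  mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0
  found=0
  while read -r pci _ _ status; do
      [[ $pci == "$dev" ]] || continue
      [[ $status == *"Active devices: "*"$mounts"* ]] && found=1
  done < <(PCI_ALLOWED=$dev "$rootdir/scripts/setup.sh" config)   # $rootdir as elsewhere in the log
  (( found == 1 ))

The heavily escaped patterns in the trace (\0\0\0\0\:\5\e\:\0\0\.\0) are just how bash xtrace prints the quoted right-hand side of ==; the comparison itself is a literal match against 0000:5e:00.0.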
00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:ae:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:03.058 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:03.058 00:05:03.058 real 0m9.668s 00:05:03.058 user 0m2.367s 00:05:03.058 sys 0m4.333s 00:05:03.058 13:27:50 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:03.059 13:27:50 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:03.059 ************************************ 00:05:03.059 END TEST dm_mount 00:05:03.059 ************************************ 00:05:03.059 13:27:50 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:03.059 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:03.059 /dev/nvme0n1: 8 bytes were erased at offset 0x37e3ee55e00 (gpt): 45 46 49 20 50 41 52 54 00:05:03.059 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:03.059 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:03.059 13:27:50 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:03.059 00:05:03.059 real 0m26.978s 00:05:03.059 user 0m7.556s 00:05:03.059 sys 0m14.273s 00:05:03.059 13:27:50 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:03.059 13:27:50 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:03.059 ************************************ 00:05:03.059 END TEST devices 00:05:03.059 ************************************ 00:05:03.317 13:27:50 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:03.317 00:05:03.317 real 1m33.068s 00:05:03.317 user 0m29.259s 00:05:03.317 sys 0m53.754s 00:05:03.317 13:27:50 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:03.317 13:27:50 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:03.317 ************************************ 00:05:03.317 END TEST setup.sh 00:05:03.317 ************************************ 00:05:03.317 13:27:50 -- common/autotest_common.sh@1142 -- # return 0 00:05:03.317 13:27:50 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:06.592 0000:85:05.5 (8086 201d): 
Skipping not allowed VMD controller at 0000:85:05.5 00:05:06.592 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:05:06.592 Hugepages 00:05:06.592 node hugesize free / total 00:05:06.592 node0 1048576kB 0 / 0 00:05:06.592 node0 2048kB 1024 / 1024 00:05:06.592 node1 1048576kB 0 / 0 00:05:06.592 node1 2048kB 1024 / 1024 00:05:06.592 00:05:06.592 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:06.592 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:06.592 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:06.850 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:06.850 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:06.850 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:06.850 VMD 0000:85:05.5 8086 201d 1 - - - 00:05:06.850 VMD 0000:ae:05.5 8086 201d 1 - - - 00:05:06.850 13:27:54 -- spdk/autotest.sh@130 -- # uname -s 00:05:06.850 13:27:54 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:06.850 13:27:54 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:06.850 13:27:54 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:10.194 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:10.195 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:05:10.195 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:10.195 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:11.572 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:11.572 13:27:59 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:12.509 13:28:00 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:12.509 13:28:00 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:12.509 13:28:00 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:12.509 13:28:00 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:12.509 13:28:00 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:12.509 13:28:00 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:12.509 13:28:00 -- 
common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:12.509 13:28:00 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:12.509 13:28:00 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:12.767 13:28:00 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:12.767 13:28:00 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:12.767 13:28:00 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:16.050 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:16.050 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:05:16.050 Waiting for block devices as requested 00:05:16.050 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:16.308 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:16.308 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:16.308 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:16.566 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:16.566 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:16.566 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:16.823 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:16.823 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:16.823 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:17.082 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:17.082 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:17.082 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:17.340 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:17.340 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:17.340 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:17.599 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:17.599 13:28:05 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:17.599 13:28:05 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:17.599 13:28:05 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:17.599 13:28:05 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:17.599 13:28:05 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:17.599 13:28:05 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:17.599 13:28:05 -- common/autotest_common.sh@1545 -- # oacs=' 0x1e' 00:05:17.599 13:28:05 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:17.599 13:28:05 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:17.599 13:28:05 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:17.599 13:28:05 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:17.599 13:28:05 -- common/autotest_common.sh@1554 -- # cut 
-d: -f2 00:05:17.599 13:28:05 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:17.599 13:28:05 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:17.599 13:28:05 -- common/autotest_common.sh@1557 -- # continue 00:05:17.599 13:28:05 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:17.599 13:28:05 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:17.599 13:28:05 -- common/autotest_common.sh@10 -- # set +x 00:05:17.599 13:28:05 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:17.599 13:28:05 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:17.599 13:28:05 -- common/autotest_common.sh@10 -- # set +x 00:05:17.599 13:28:05 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:21.786 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:21.786 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:05:21.786 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.786 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:22.354 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:22.613 13:28:10 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:22.613 13:28:10 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:22.613 13:28:10 -- common/autotest_common.sh@10 -- # set +x 00:05:22.613 13:28:10 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:22.613 13:28:10 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:22.613 13:28:10 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:22.613 13:28:10 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:22.613 13:28:10 -- common/autotest_common.sh@1577 -- # local bdfs 00:05:22.613 13:28:10 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:05:22.613 13:28:10 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:22.613 13:28:10 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:22.613 13:28:10 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:22.613 13:28:10 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:22.613 13:28:10 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:22.613 13:28:10 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:22.613 13:28:10 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:22.613 13:28:10 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:22.613 13:28:10 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:05:22.613 13:28:10 -- common/autotest_common.sh@1580 -- # device=0x0b60 
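Both nvme_namespace_revert and the opal_revert_cleanup step that follows start from the same discovery helper: get_nvme_bdfs runs gen_nvme.sh and pulls each controller's PCI address (traddr) out of the generated JSON with jq, and the per-controller checks (oacs namespace-management bit, unvmcap, PCI device ID) then decide whether anything needs reverting. A sketch of that discovery plus the device-ID filter applied next, using the same jq expression seen in the trace:

  rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
  bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))   # -> 0000:5e:00.0 here
  for bdf in "${bdfs[@]}"; do
      device=$(cat "/sys/bus/pci/devices/$bdf/device")
      # only controllers reporting PCI device ID 0x0a54 get an OPAL revert;
      # this system's controller is 0x0b60, so the list stays empty and the step is a no-op
      [[ $device == 0x0a54 ]] && printf '%s\n' "$bdf"
  done

The earlier oacs check works the same way: nvme id-ctrl /dev/nvme0 reports oacs 0x1e, bit 3 (value 8) means namespace management is supported, and an unvmcap of 0 means there is no unallocated capacity to reclaim, so the namespace layout is left alone.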
00:05:22.613 13:28:10 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:05:22.613 13:28:10 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:05:22.613 13:28:10 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:05:22.613 13:28:10 -- common/autotest_common.sh@1593 -- # return 0 00:05:22.613 13:28:10 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:22.613 13:28:10 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:22.613 13:28:10 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:22.613 13:28:10 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:22.614 13:28:10 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:23.182 Restarting all devices. 00:05:27.371 lstat() error: No such file or directory 00:05:27.371 QAT Error: No GENERAL section found 00:05:27.371 Failed to configure qat_dev0 00:05:27.371 lstat() error: No such file or directory 00:05:27.371 QAT Error: No GENERAL section found 00:05:27.371 Failed to configure qat_dev1 00:05:27.371 lstat() error: No such file or directory 00:05:27.371 QAT Error: No GENERAL section found 00:05:27.371 Failed to configure qat_dev2 00:05:27.371 enable sriov 00:05:27.371 Checking status of all devices. 00:05:27.371 There is 3 QAT acceleration device(s) in the system: 00:05:27.371 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:05:27.371 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:05:27.371 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:05:27.371 0000:3d:00.0 set to 16 VFs 00:05:27.629 0000:3f:00.0 set to 16 VFs 00:05:28.197 0000:da:00.0 set to 16 VFs 00:05:28.455 Properly configured the qat device with driver uio_pci_generic. 00:05:28.455 13:28:15 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:28.455 13:28:15 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:28.455 13:28:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.455 13:28:15 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:28.455 13:28:15 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:28.455 13:28:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:28.455 13:28:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.455 13:28:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.455 ************************************ 00:05:28.455 START TEST env 00:05:28.455 ************************************ 00:05:28.455 13:28:15 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:28.455 * Looking for test storage... 
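A note on the qat_setup.sh block above, just before the env suite starts: the lstat()/'No GENERAL section found' messages appear to mean the per-device QAT configuration files were not found, so configuring the three physical c6xx endpoints fails, and the script then goes on to 'enable sriov', carving each endpoint (0000:3d:00.0, 0000:3f:00.0, 0000:da:00.0) into 16 virtual functions that end up on uio_pci_generic for SPDK/DPDK. The exact commands are not echoed in the log; a hypothetical illustration of the standard SR-IOV step being reported as 'set to 16 VFs':

  for bdf in 0000:3d:00.0 0000:3f:00.0 0000:da:00.0; do
      echo 0  > "/sys/bus/pci/devices/$bdf/sriov_numvfs"     # drop any existing VFs first
      echo 16 > "/sys/bus/pci/devices/$bdf/sriov_numvfs"     # create 16 VFs per endpoint
  done

The 48 resulting VFs are the 0000:3d:01.x/02.x, 0000:3f:01.x/02.x and 0000:da:01.x/02.x functions that EAL probes with the qat PCI driver further down.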
00:05:28.455 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:28.455 13:28:16 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:28.455 13:28:16 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:28.455 13:28:16 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.455 13:28:16 env -- common/autotest_common.sh@10 -- # set +x 00:05:28.713 ************************************ 00:05:28.713 START TEST env_memory 00:05:28.713 ************************************ 00:05:28.713 13:28:16 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:28.713 00:05:28.713 00:05:28.713 CUnit - A unit testing framework for C - Version 2.1-3 00:05:28.713 http://cunit.sourceforge.net/ 00:05:28.713 00:05:28.713 00:05:28.713 Suite: memory 00:05:28.714 Test: alloc and free memory map ...[2024-07-15 13:28:16.124468] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:28.714 passed 00:05:28.714 Test: mem map translation ...[2024-07-15 13:28:16.143717] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:28.714 [2024-07-15 13:28:16.143733] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:28.714 [2024-07-15 13:28:16.143786] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:28.714 [2024-07-15 13:28:16.143796] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:28.714 passed 00:05:28.714 Test: mem map registration ...[2024-07-15 13:28:16.179811] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:28.714 [2024-07-15 13:28:16.179827] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:28.714 passed 00:05:28.714 Test: mem map adjacent registrations ...passed 00:05:28.714 00:05:28.714 Run Summary: Type Total Ran Passed Failed Inactive 00:05:28.714 suites 1 1 n/a 0 0 00:05:28.714 tests 4 4 4 0 0 00:05:28.714 asserts 152 152 152 0 n/a 00:05:28.714 00:05:28.714 Elapsed time = 0.134 seconds 00:05:28.714 00:05:28.714 real 0m0.148s 00:05:28.714 user 0m0.133s 00:05:28.714 sys 0m0.015s 00:05:28.714 13:28:16 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:28.714 13:28:16 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:28.714 ************************************ 00:05:28.714 END TEST env_memory 00:05:28.714 ************************************ 00:05:28.714 13:28:16 env -- common/autotest_common.sh@1142 -- # return 0 00:05:28.714 13:28:16 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:28.714 13:28:16 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:28.714 13:28:16 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.714 13:28:16 env -- common/autotest_common.sh@10 -- # set +x 00:05:28.714 ************************************ 00:05:28.714 START TEST env_vtophys 00:05:28.714 ************************************ 00:05:28.714 13:28:16 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:28.973 EAL: lib.eal log level changed from notice to debug 00:05:28.973 EAL: Detected lcore 0 as core 0 on socket 0 00:05:28.973 EAL: Detected lcore 1 as core 1 on socket 0 00:05:28.973 EAL: Detected lcore 2 as core 2 on socket 0 00:05:28.973 EAL: Detected lcore 3 as core 3 on socket 0 00:05:28.973 EAL: Detected lcore 4 as core 4 on socket 0 00:05:28.973 EAL: Detected lcore 5 as core 8 on socket 0 00:05:28.973 EAL: Detected lcore 6 as core 9 on socket 0 00:05:28.973 EAL: Detected lcore 7 as core 10 on socket 0 00:05:28.973 EAL: Detected lcore 8 as core 11 on socket 0 00:05:28.973 EAL: Detected lcore 9 as core 16 on socket 0 00:05:28.973 EAL: Detected lcore 10 as core 17 on socket 0 00:05:28.973 EAL: Detected lcore 11 as core 18 on socket 0 00:05:28.973 EAL: Detected lcore 12 as core 19 on socket 0 00:05:28.973 EAL: Detected lcore 13 as core 20 on socket 0 00:05:28.973 EAL: Detected lcore 14 as core 24 on socket 0 00:05:28.973 EAL: Detected lcore 15 as core 25 on socket 0 00:05:28.973 EAL: Detected lcore 16 as core 26 on socket 0 00:05:28.974 EAL: Detected lcore 17 as core 27 on socket 0 00:05:28.974 EAL: Detected lcore 18 as core 0 on socket 1 00:05:28.974 EAL: Detected lcore 19 as core 1 on socket 1 00:05:28.974 EAL: Detected lcore 20 as core 2 on socket 1 00:05:28.974 EAL: Detected lcore 21 as core 3 on socket 1 00:05:28.974 EAL: Detected lcore 22 as core 4 on socket 1 00:05:28.974 EAL: Detected lcore 23 as core 8 on socket 1 00:05:28.974 EAL: Detected lcore 24 as core 9 on socket 1 00:05:28.974 EAL: Detected lcore 25 as core 10 on socket 1 00:05:28.974 EAL: Detected lcore 26 as core 11 on socket 1 00:05:28.974 EAL: Detected lcore 27 as core 16 on socket 1 00:05:28.974 EAL: Detected lcore 28 as core 17 on socket 1 00:05:28.974 EAL: Detected lcore 29 as core 18 on socket 1 00:05:28.974 EAL: Detected lcore 30 as core 19 on socket 1 00:05:28.974 EAL: Detected lcore 31 as core 20 on socket 1 00:05:28.974 EAL: Detected lcore 32 as core 24 on socket 1 00:05:28.974 EAL: Detected lcore 33 as core 25 on socket 1 00:05:28.974 EAL: Detected lcore 34 as core 26 on socket 1 00:05:28.974 EAL: Detected lcore 35 as core 27 on socket 1 00:05:28.974 EAL: Detected lcore 36 as core 0 on socket 0 00:05:28.974 EAL: Detected lcore 37 as core 1 on socket 0 00:05:28.974 EAL: Detected lcore 38 as core 2 on socket 0 00:05:28.974 EAL: Detected lcore 39 as core 3 on socket 0 00:05:28.974 EAL: Detected lcore 40 as core 4 on socket 0 00:05:28.974 EAL: Detected lcore 41 as core 8 on socket 0 00:05:28.974 EAL: Detected lcore 42 as core 9 on socket 0 00:05:28.974 EAL: Detected lcore 43 as core 10 on socket 0 00:05:28.974 EAL: Detected lcore 44 as core 11 on socket 0 00:05:28.974 EAL: Detected lcore 45 as core 16 on socket 0 00:05:28.974 EAL: Detected lcore 46 as core 17 on socket 0 00:05:28.974 EAL: Detected lcore 47 as core 18 on socket 0 00:05:28.974 EAL: Detected lcore 48 as core 19 on socket 0 00:05:28.974 EAL: Detected lcore 49 as core 20 on socket 0 00:05:28.974 EAL: Detected lcore 50 as core 24 on socket 0 00:05:28.974 EAL: Detected lcore 51 as core 25 on socket 0 00:05:28.974 EAL: Detected lcore 52 as core 
26 on socket 0 00:05:28.974 EAL: Detected lcore 53 as core 27 on socket 0 00:05:28.974 EAL: Detected lcore 54 as core 0 on socket 1 00:05:28.974 EAL: Detected lcore 55 as core 1 on socket 1 00:05:28.974 EAL: Detected lcore 56 as core 2 on socket 1 00:05:28.974 EAL: Detected lcore 57 as core 3 on socket 1 00:05:28.974 EAL: Detected lcore 58 as core 4 on socket 1 00:05:28.974 EAL: Detected lcore 59 as core 8 on socket 1 00:05:28.974 EAL: Detected lcore 60 as core 9 on socket 1 00:05:28.974 EAL: Detected lcore 61 as core 10 on socket 1 00:05:28.974 EAL: Detected lcore 62 as core 11 on socket 1 00:05:28.974 EAL: Detected lcore 63 as core 16 on socket 1 00:05:28.974 EAL: Detected lcore 64 as core 17 on socket 1 00:05:28.974 EAL: Detected lcore 65 as core 18 on socket 1 00:05:28.974 EAL: Detected lcore 66 as core 19 on socket 1 00:05:28.974 EAL: Detected lcore 67 as core 20 on socket 1 00:05:28.974 EAL: Detected lcore 68 as core 24 on socket 1 00:05:28.974 EAL: Detected lcore 69 as core 25 on socket 1 00:05:28.974 EAL: Detected lcore 70 as core 26 on socket 1 00:05:28.974 EAL: Detected lcore 71 as core 27 on socket 1 00:05:28.974 EAL: Maximum logical cores by configuration: 128 00:05:28.974 EAL: Detected CPU lcores: 72 00:05:28.974 EAL: Detected NUMA nodes: 2 00:05:28.974 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:28.974 EAL: Detected shared linkage of DPDK 00:05:28.974 EAL: No shared files mode enabled, IPC will be disabled 00:05:28.974 EAL: No shared files mode enabled, IPC is disabled 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.2 
wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:05:28.974 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:05:28.974 EAL: Bus pci wants IOVA as 'PA' 00:05:28.974 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:28.974 EAL: Bus vdev wants IOVA as 'DC' 00:05:28.974 EAL: Selected IOVA mode 'PA' 00:05:28.974 EAL: Probing VFIO support... 00:05:28.974 EAL: IOMMU type 1 (Type 1) is supported 00:05:28.974 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:28.974 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:28.974 EAL: VFIO support initialized 00:05:28.974 EAL: Ask a virtual area of 0x2e000 bytes 00:05:28.974 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:28.974 EAL: Setting up physically contiguous memory... 
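The EAL messages above (lcore and NUMA detection, IOVA mode selection, VFIO probing, virtual-area reservation for the memseg lists) are emitted while each env test binary brings up the SPDK environment layer. As a minimal sketch of the kind of initialization that produces them, assuming only the public SPDK env API (spdk_env_opts_init/spdk_env_init/spdk_env_fini) and illustrative option values, not the actual test sources:

#include <stdio.h>
#include "spdk/env.h"

int main(void)
{
	struct spdk_env_opts opts;

	spdk_env_opts_init(&opts);
	opts.name = "env_example";   /* hypothetical app name, not taken from the log */
	opts.core_mask = "0x1";      /* mirrors the -c 0x1 passed to the tests later in the log */

	/* spdk_env_init() drives DPDK's rte_eal_init(), which prints the EAL lines
	 * above: lcore/NUMA detection, IOVA mode, VFIO support, memseg reservation. */
	if (spdk_env_init(&opts) < 0) {
		fprintf(stderr, "Unable to initialize SPDK env\n");
		return 1;
	}

	spdk_env_fini();
	return 0;
}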
00:05:28.974 EAL: Setting maximum number of open files to 524288 00:05:28.974 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:28.974 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:28.974 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:28.974 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:28.974 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.974 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:28.974 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.974 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.974 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:28.974 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:28.974 EAL: Hugepages will be freed exactly as allocated. 00:05:28.974 EAL: No shared files mode enabled, IPC is disabled 00:05:28.974 EAL: No shared files mode enabled, IPC is disabled 00:05:28.974 EAL: TSC frequency is ~2300000 KHz 00:05:28.975 EAL: Main lcore 0 is ready (tid=7f98421e1b00;cpuset=[0]) 00:05:28.975 EAL: Trying to obtain current memory policy. 00:05:28.975 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.975 EAL: Restoring previous memory policy: 0 00:05:28.975 EAL: request: mp_malloc_sync 00:05:28.975 EAL: No shared files mode enabled, IPC is disabled 00:05:28.975 EAL: Heap on socket 0 was expanded by 2MB 00:05:28.975 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001000000 00:05:28.975 EAL: PCI memory mapped at 0x202001001000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001002000 00:05:28.975 EAL: PCI memory mapped at 0x202001003000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001004000 00:05:28.975 EAL: PCI memory mapped at 0x202001005000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001006000 00:05:28.975 EAL: PCI memory mapped at 0x202001007000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001008000 00:05:28.975 EAL: PCI memory mapped at 0x202001009000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200100a000 00:05:28.975 EAL: PCI memory mapped at 0x20200100b000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200100c000 00:05:28.975 EAL: PCI memory mapped at 0x20200100d000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200100e000 00:05:28.975 EAL: PCI memory mapped at 0x20200100f000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001010000 00:05:28.975 EAL: PCI memory mapped at 0x202001011000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 
EAL: PCI memory mapped at 0x202001012000 00:05:28.975 EAL: PCI memory mapped at 0x202001013000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001014000 00:05:28.975 EAL: PCI memory mapped at 0x202001015000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001016000 00:05:28.975 EAL: PCI memory mapped at 0x202001017000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001018000 00:05:28.975 EAL: PCI memory mapped at 0x202001019000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200101a000 00:05:28.975 EAL: PCI memory mapped at 0x20200101b000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200101c000 00:05:28.975 EAL: PCI memory mapped at 0x20200101d000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:28.975 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200101e000 00:05:28.975 EAL: PCI memory mapped at 0x20200101f000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001020000 00:05:28.975 EAL: PCI memory mapped at 0x202001021000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001022000 00:05:28.975 EAL: PCI memory mapped at 0x202001023000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001024000 00:05:28.975 EAL: PCI memory mapped at 0x202001025000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001026000 00:05:28.975 EAL: PCI memory mapped at 0x202001027000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001028000 00:05:28.975 EAL: PCI memory mapped at 0x202001029000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 
00:05:28.975 EAL: PCI memory mapped at 0x20200102a000 00:05:28.975 EAL: PCI memory mapped at 0x20200102b000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200102c000 00:05:28.975 EAL: PCI memory mapped at 0x20200102d000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200102e000 00:05:28.975 EAL: PCI memory mapped at 0x20200102f000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001030000 00:05:28.975 EAL: PCI memory mapped at 0x202001031000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001032000 00:05:28.975 EAL: PCI memory mapped at 0x202001033000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001034000 00:05:28.975 EAL: PCI memory mapped at 0x202001035000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001036000 00:05:28.975 EAL: PCI memory mapped at 0x202001037000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001038000 00:05:28.975 EAL: PCI memory mapped at 0x202001039000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200103a000 00:05:28.975 EAL: PCI memory mapped at 0x20200103b000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200103c000 00:05:28.975 EAL: PCI memory mapped at 0x20200103d000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:28.975 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200103e000 00:05:28.975 EAL: PCI memory mapped at 0x20200103f000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:28.975 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001040000 00:05:28.975 EAL: PCI memory mapped at 0x202001041000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:28.975 EAL: Trying to obtain current memory policy. 
00:05:28.975 EAL: Setting policy MPOL_PREFERRED for socket 1 00:05:28.975 EAL: Restoring previous memory policy: 4 00:05:28.975 EAL: request: mp_malloc_sync 00:05:28.975 EAL: No shared files mode enabled, IPC is disabled 00:05:28.975 EAL: Heap on socket 1 was expanded by 2MB 00:05:28.975 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001042000 00:05:28.975 EAL: PCI memory mapped at 0x202001043000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:28.975 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001044000 00:05:28.975 EAL: PCI memory mapped at 0x202001045000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:28.975 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001046000 00:05:28.975 EAL: PCI memory mapped at 0x202001047000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:28.975 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x202001048000 00:05:28.975 EAL: PCI memory mapped at 0x202001049000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:28.975 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200104a000 00:05:28.975 EAL: PCI memory mapped at 0x20200104b000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:28.975 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.975 EAL: PCI memory mapped at 0x20200104c000 00:05:28.975 EAL: PCI memory mapped at 0x20200104d000 00:05:28.975 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:28.975 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:05:28.975 EAL: probe driver: 8086:37c9 qat 00:05:28.976 EAL: PCI memory mapped at 0x20200104e000 00:05:28.976 EAL: PCI memory mapped at 0x20200104f000 00:05:28.976 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:28.976 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:05:28.976 EAL: probe driver: 8086:37c9 qat 00:05:28.976 EAL: PCI memory mapped at 0x202001050000 00:05:28.976 EAL: PCI memory mapped at 0x202001051000 00:05:28.976 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:28.976 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:05:28.976 EAL: probe driver: 8086:37c9 qat 00:05:28.976 EAL: PCI memory mapped at 0x202001052000 00:05:28.976 EAL: PCI memory mapped at 0x202001053000 00:05:28.976 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:29.543 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:05:29.543 EAL: probe driver: 8086:37c9 qat 00:05:29.543 EAL: PCI memory mapped at 0x202001054000 00:05:29.543 EAL: PCI memory mapped at 0x202001055000 00:05:29.543 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:29.543 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:05:29.543 EAL: probe driver: 8086:37c9 qat 00:05:29.543 EAL: PCI memory mapped at 0x202001056000 00:05:29.543 EAL: PCI memory mapped at 0x202001057000 00:05:29.543 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 
00:05:29.543 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:05:29.543 EAL: probe driver: 8086:37c9 qat 00:05:29.543 EAL: PCI memory mapped at 0x202001058000 00:05:29.543 EAL: PCI memory mapped at 0x202001059000 00:05:29.543 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:29.543 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:05:29.543 EAL: probe driver: 8086:37c9 qat 00:05:29.543 EAL: PCI memory mapped at 0x20200105a000 00:05:29.543 EAL: PCI memory mapped at 0x20200105b000 00:05:29.543 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:29.543 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:05:29.543 EAL: probe driver: 8086:37c9 qat 00:05:29.543 EAL: PCI memory mapped at 0x20200105c000 00:05:29.543 EAL: PCI memory mapped at 0x20200105d000 00:05:29.543 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:29.543 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:05:29.543 EAL: probe driver: 8086:37c9 qat 00:05:29.543 EAL: PCI memory mapped at 0x20200105e000 00:05:29.543 EAL: PCI memory mapped at 0x20200105f000 00:05:29.543 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:29.543 EAL: No shared files mode enabled, IPC is disabled 00:05:29.543 EAL: No shared files mode enabled, IPC is disabled 00:05:29.543 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:29.543 EAL: Mem event callback 'spdk:(nil)' registered 00:05:29.543 00:05:29.543 00:05:29.543 CUnit - A unit testing framework for C - Version 2.1-3 00:05:29.543 http://cunit.sourceforge.net/ 00:05:29.543 00:05:29.543 00:05:29.543 Suite: components_suite 00:05:29.543 Test: vtophys_malloc_test ...passed 00:05:29.543 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:29.543 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.543 EAL: Restoring previous memory policy: 4 00:05:29.543 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.543 EAL: request: mp_malloc_sync 00:05:29.543 EAL: No shared files mode enabled, IPC is disabled 00:05:29.543 EAL: Heap on socket 0 was expanded by 4MB 00:05:29.543 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.543 EAL: request: mp_malloc_sync 00:05:29.543 EAL: No shared files mode enabled, IPC is disabled 00:05:29.543 EAL: Heap on socket 0 was shrunk by 4MB 00:05:29.543 EAL: Trying to obtain current memory policy. 00:05:29.543 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.543 EAL: Restoring previous memory policy: 4 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was expanded by 6MB 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was shrunk by 6MB 00:05:29.544 EAL: Trying to obtain current memory policy. 
00:05:29.544 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.544 EAL: Restoring previous memory policy: 4 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was expanded by 10MB 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was shrunk by 10MB 00:05:29.544 EAL: Trying to obtain current memory policy. 00:05:29.544 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.544 EAL: Restoring previous memory policy: 4 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was expanded by 18MB 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was shrunk by 18MB 00:05:29.544 EAL: Trying to obtain current memory policy. 00:05:29.544 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.544 EAL: Restoring previous memory policy: 4 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was expanded by 34MB 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was shrunk by 34MB 00:05:29.544 EAL: Trying to obtain current memory policy. 00:05:29.544 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.544 EAL: Restoring previous memory policy: 4 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was expanded by 66MB 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was shrunk by 66MB 00:05:29.544 EAL: Trying to obtain current memory policy. 00:05:29.544 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.544 EAL: Restoring previous memory policy: 4 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was expanded by 130MB 00:05:29.544 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.544 EAL: request: mp_malloc_sync 00:05:29.544 EAL: No shared files mode enabled, IPC is disabled 00:05:29.544 EAL: Heap on socket 0 was shrunk by 130MB 00:05:29.544 EAL: Trying to obtain current memory policy. 
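The heap expand/shrink pairs above and below are vtophys_spdk_malloc_test allocating DMA-safe buffers of roughly doubling size, translating them to physical addresses, and freeing them again; the exact megabyte counts in the log include per-allocation rounding. A minimal sketch of that pattern, assuming only the public SPDK env API (spdk_dma_zmalloc, spdk_vtophys, spdk_dma_free) and illustrative sizes rather than the actual test source, after which the log resumes with the remaining rounds and the CUnit run summary:

#include "spdk/env.h"

/* Allocate DMA-safe buffers of doubling size, translate each virtual address to a
 * physical address with spdk_vtophys(), then free the buffer (shrinking the heap). */
static int
check_vtophys_doubling(void)
{
	size_t size;

	for (size = 4 * 1024 * 1024; size <= 1024 * 1024 * 1024; size *= 2) {
		void *buf = spdk_dma_zmalloc(size, 0x200000 /* 2 MB alignment, illustrative */, NULL);

		if (buf == NULL) {
			return -1;
		}
		if (spdk_vtophys(buf, NULL) == SPDK_VTOPHYS_ERROR) {
			spdk_dma_free(buf);
			return -1;
		}
		spdk_dma_free(buf);
	}
	return 0;
}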
00:05:29.544 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.802 EAL: Restoring previous memory policy: 4 00:05:29.802 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.802 EAL: request: mp_malloc_sync 00:05:29.802 EAL: No shared files mode enabled, IPC is disabled 00:05:29.802 EAL: Heap on socket 0 was expanded by 258MB 00:05:29.802 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.802 EAL: request: mp_malloc_sync 00:05:29.802 EAL: No shared files mode enabled, IPC is disabled 00:05:29.802 EAL: Heap on socket 0 was shrunk by 258MB 00:05:29.802 EAL: Trying to obtain current memory policy. 00:05:29.802 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.802 EAL: Restoring previous memory policy: 4 00:05:29.802 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.802 EAL: request: mp_malloc_sync 00:05:29.802 EAL: No shared files mode enabled, IPC is disabled 00:05:29.802 EAL: Heap on socket 0 was expanded by 514MB 00:05:30.060 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.060 EAL: request: mp_malloc_sync 00:05:30.060 EAL: No shared files mode enabled, IPC is disabled 00:05:30.060 EAL: Heap on socket 0 was shrunk by 514MB 00:05:30.060 EAL: Trying to obtain current memory policy. 00:05:30.060 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:30.318 EAL: Restoring previous memory policy: 4 00:05:30.318 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.318 EAL: request: mp_malloc_sync 00:05:30.318 EAL: No shared files mode enabled, IPC is disabled 00:05:30.318 EAL: Heap on socket 0 was expanded by 1026MB 00:05:30.576 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.576 EAL: request: mp_malloc_sync 00:05:30.576 EAL: No shared files mode enabled, IPC is disabled 00:05:30.576 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:30.576 passed 00:05:30.576 00:05:30.576 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.576 suites 1 1 n/a 0 0 00:05:30.576 tests 2 2 2 0 0 00:05:30.576 asserts 5932 5932 5932 0 n/a 00:05:30.576 00:05:30.576 Elapsed time = 1.126 seconds 00:05:30.576 EAL: No shared files mode enabled, IPC is disabled 00:05:30.576 EAL: No shared files mode enabled, IPC is disabled 00:05:30.576 EAL: No shared files mode enabled, IPC is disabled 00:05:30.576 00:05:30.576 real 0m1.858s 00:05:30.576 user 0m0.735s 00:05:30.576 sys 0m0.517s 00:05:30.576 13:28:18 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:30.576 13:28:18 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:30.576 ************************************ 00:05:30.576 END TEST env_vtophys 00:05:30.576 ************************************ 00:05:30.833 13:28:18 env -- common/autotest_common.sh@1142 -- # return 0 00:05:30.833 13:28:18 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.833 13:28:18 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:30.833 13:28:18 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.833 13:28:18 env -- common/autotest_common.sh@10 -- # set +x 00:05:30.833 ************************************ 00:05:30.833 START TEST env_pci 00:05:30.833 ************************************ 00:05:30.833 13:28:18 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.833 00:05:30.833 00:05:30.833 CUnit - A unit testing framework for C - Version 2.1-3 00:05:30.833 http://cunit.sourceforge.net/ 00:05:30.833 00:05:30.833 00:05:30.833 Suite: pci 00:05:30.833 Test: 
pci_hook ...[2024-07-15 13:28:18.258016] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 4128762 has claimed it 00:05:30.833 EAL: Cannot find device (10000:00:01.0) 00:05:30.833 EAL: Failed to attach device on primary process 00:05:30.833 passed 00:05:30.833 00:05:30.833 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.833 suites 1 1 n/a 0 0 00:05:30.833 tests 1 1 1 0 0 00:05:30.833 asserts 25 25 25 0 n/a 00:05:30.833 00:05:30.833 Elapsed time = 0.032 seconds 00:05:30.833 00:05:30.833 real 0m0.057s 00:05:30.833 user 0m0.014s 00:05:30.833 sys 0m0.043s 00:05:30.833 13:28:18 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:30.833 13:28:18 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:30.833 ************************************ 00:05:30.833 END TEST env_pci 00:05:30.833 ************************************ 00:05:30.833 13:28:18 env -- common/autotest_common.sh@1142 -- # return 0 00:05:30.833 13:28:18 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:30.833 13:28:18 env -- env/env.sh@15 -- # uname 00:05:30.833 13:28:18 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:30.833 13:28:18 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:30.833 13:28:18 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.833 13:28:18 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:05:30.833 13:28:18 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.833 13:28:18 env -- common/autotest_common.sh@10 -- # set +x 00:05:30.833 ************************************ 00:05:30.833 START TEST env_dpdk_post_init 00:05:30.833 ************************************ 00:05:30.833 13:28:18 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.833 EAL: Detected CPU lcores: 72 00:05:30.833 EAL: Detected NUMA nodes: 2 00:05:30.833 EAL: Detected shared linkage of DPDK 00:05:30.833 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:30.833 EAL: Selected IOVA mode 'PA' 00:05:30.833 EAL: VFIO support initialized 00:05:30.833 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:30.833 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:30.833 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.833 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, 
max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: 
Creating cryptodev 0000:3d:02.2_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 
00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters 
- name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.834 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.834 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:30.834 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:30.835 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:30.835 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:30.835 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:31.093 EAL: Using IOMMU type 1 (Type 1) 00:05:31.093 EAL: Ignore mapping IO port bar(1) 00:05:31.093 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:31.093 EAL: Ignore mapping IO port bar(1) 00:05:31.093 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:31.093 EAL: Ignore mapping IO port bar(1) 00:05:31.093 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:31.093 EAL: Ignore mapping IO port bar(1) 00:05:31.093 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:31.094 EAL: Ignore mapping IO port bar(1) 00:05:31.094 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:31.094 EAL: Ignore mapping IO port bar(1) 00:05:31.094 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:31.094 EAL: Ignore mapping IO port bar(1) 00:05:31.094 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:31.094 EAL: Ignore mapping IO port bar(1) 00:05:31.094 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:31.352 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: 
spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:05:31.352 EAL: Ignore mapping IO port bar(1) 00:05:31.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:32.894 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:32.894 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:32.894 Starting DPDK initialization... 00:05:32.894 Starting SPDK post initialization... 00:05:32.894 SPDK NVMe probe 00:05:32.894 Attaching to 0000:5e:00.0 00:05:32.894 Attached to 0000:5e:00.0 00:05:32.894 Cleaning up... 00:05:32.894 00:05:32.894 real 0m2.035s 00:05:32.894 user 0m1.288s 00:05:32.894 sys 0m0.334s 00:05:32.895 13:28:20 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.895 13:28:20 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:32.895 ************************************ 00:05:32.895 END TEST env_dpdk_post_init 00:05:32.895 ************************************ 00:05:32.895 13:28:20 env -- common/autotest_common.sh@1142 -- # return 0 00:05:32.895 13:28:20 env -- env/env.sh@26 -- # uname 00:05:32.895 13:28:20 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:32.895 13:28:20 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:32.895 13:28:20 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:32.895 13:28:20 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:32.895 13:28:20 env -- common/autotest_common.sh@10 -- # set +x 00:05:32.895 ************************************ 00:05:32.895 START TEST env_mem_callbacks 00:05:32.895 ************************************ 00:05:32.895 13:28:20 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:33.153 EAL: Detected CPU lcores: 72 00:05:33.153 EAL: Detected NUMA nodes: 2 00:05:33.153 EAL: Detected shared linkage of DPDK 00:05:33.153 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:33.153 EAL: Selected IOVA mode 'PA' 00:05:33.153 EAL: VFIO support initialized 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:33.153 CRYPTODEV: Creating 
cryptodev 0000:3d:01.1_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:33.153 
CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:33.153 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:33.153 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 
0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue 
pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating 
cryptodev 0000:da:01.1_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:33.154 
CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:33.154 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:33.154 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:33.155 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:33.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:33.155 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:33.155 00:05:33.155 00:05:33.155 CUnit - A unit testing framework for C - Version 2.1-3 00:05:33.155 http://cunit.sourceforge.net/ 00:05:33.155 00:05:33.155 00:05:33.155 Suite: memory 00:05:33.155 Test: test ... 
00:05:33.155 register 0x200000200000 2097152 00:05:33.155 register 0x201000a00000 2097152 00:05:33.155 malloc 3145728 00:05:33.155 register 0x200000400000 4194304 00:05:33.155 buf 0x200000500000 len 3145728 PASSED 00:05:33.155 malloc 64 00:05:33.155 buf 0x2000004fff40 len 64 PASSED 00:05:33.155 malloc 4194304 00:05:33.155 register 0x200000800000 6291456 00:05:33.155 buf 0x200000a00000 len 4194304 PASSED 00:05:33.155 free 0x200000500000 3145728 00:05:33.155 free 0x2000004fff40 64 00:05:33.155 unregister 0x200000400000 4194304 PASSED 00:05:33.155 free 0x200000a00000 4194304 00:05:33.155 unregister 0x200000800000 6291456 PASSED 00:05:33.155 malloc 8388608 00:05:33.155 register 0x200000400000 10485760 00:05:33.155 buf 0x200000600000 len 8388608 PASSED 00:05:33.155 free 0x200000600000 8388608 00:05:33.155 unregister 0x200000400000 10485760 PASSED 00:05:33.155 passed 00:05:33.155 00:05:33.155 Run Summary: Type Total Ran Passed Failed Inactive 00:05:33.155 suites 1 1 n/a 0 0 00:05:33.155 tests 1 1 1 0 0 00:05:33.155 asserts 16 16 16 0 n/a 00:05:33.155 00:05:33.155 Elapsed time = 0.007 seconds 00:05:33.155 00:05:33.155 real 0m0.089s 00:05:33.155 user 0m0.030s 00:05:33.155 sys 0m0.059s 00:05:33.155 13:28:20 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.155 13:28:20 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:33.155 ************************************ 00:05:33.155 END TEST env_mem_callbacks 00:05:33.155 ************************************ 00:05:33.155 13:28:20 env -- common/autotest_common.sh@1142 -- # return 0 00:05:33.155 00:05:33.155 real 0m4.662s 00:05:33.155 user 0m2.378s 00:05:33.155 sys 0m1.300s 00:05:33.155 13:28:20 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.155 13:28:20 env -- common/autotest_common.sh@10 -- # set +x 00:05:33.155 ************************************ 00:05:33.155 END TEST env 00:05:33.155 ************************************ 00:05:33.155 13:28:20 -- common/autotest_common.sh@1142 -- # return 0 00:05:33.155 13:28:20 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:33.155 13:28:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:33.155 13:28:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:33.155 13:28:20 -- common/autotest_common.sh@10 -- # set +x 00:05:33.155 ************************************ 00:05:33.155 START TEST rpc 00:05:33.155 ************************************ 00:05:33.155 13:28:20 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:33.412 * Looking for test storage... 
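The register/unregister lines in the memory suite above are the EAL memory-event notifications firing as the test allocates and frees buffers large enough to grow and shrink the DPDK heap; each notification reports the virtual address and length of the segment being added or removed, and the CUnit summary confirms all 16 asserts passed. A minimal sketch of re-running just that binary on its own, assuming root privileges and hugepages are already configured (the binary path is the one shown in the log):

    MEM_CB=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
    if sudo "$MEM_CB"; then
        echo "mem_callbacks: all register/unregister asserts passed"
    else
        echo "mem_callbacks: CUnit reported failed asserts" >&2
        exit 1
    fi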
00:05:33.412 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:33.412 13:28:20 rpc -- rpc/rpc.sh@65 -- # spdk_pid=4129184 00:05:33.412 13:28:20 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:33.412 13:28:20 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:33.412 13:28:20 rpc -- rpc/rpc.sh@67 -- # waitforlisten 4129184 00:05:33.412 13:28:20 rpc -- common/autotest_common.sh@829 -- # '[' -z 4129184 ']' 00:05:33.412 13:28:20 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:33.412 13:28:20 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:33.412 13:28:20 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:33.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:33.412 13:28:20 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:33.412 13:28:20 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:33.412 [2024-07-15 13:28:20.873560] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:05:33.412 [2024-07-15 13:28:20.873618] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4129184 ] 00:05:33.412 [2024-07-15 13:28:20.962243] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.669 [2024-07-15 13:28:21.051440] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:33.669 [2024-07-15 13:28:21.051480] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 4129184' to capture a snapshot of events at runtime. 00:05:33.669 [2024-07-15 13:28:21.051489] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:33.669 [2024-07-15 13:28:21.051498] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:33.669 [2024-07-15 13:28:21.051505] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid4129184 for offline analysis/debug. 
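The rpc suite startup above follows the usual pattern: spdk_tgt is launched with the bdev tracepoint group enabled, and the harness blocks until the target is listening on its UNIX-domain RPC socket before any rpc_cmd is issued. A minimal sketch of that sequence, assuming the default /var/tmp/spdk.sock socket and scripts/rpc.py; the polling loop is a simplified stand-in for the suite's waitforlisten() helper:

    SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK_ROOT/build/bin/spdk_tgt" -e bdev &
    spdk_pid=$!
    trap 'kill "$spdk_pid"' EXIT
    # Wait for the RPC socket to appear instead of calling waitforlisten().
    until [ -S /var/tmp/spdk.sock ]; do sleep 0.1; done
    "$SPDK_ROOT/scripts/rpc.py" spdk_get_version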
00:05:33.669 [2024-07-15 13:28:21.051527] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.235 13:28:21 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:34.235 13:28:21 rpc -- common/autotest_common.sh@862 -- # return 0 00:05:34.235 13:28:21 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:34.235 13:28:21 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:34.235 13:28:21 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:34.235 13:28:21 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:34.235 13:28:21 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:34.235 13:28:21 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.235 13:28:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.235 ************************************ 00:05:34.235 START TEST rpc_integrity 00:05:34.235 ************************************ 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:34.235 { 00:05:34.235 "name": "Malloc0", 00:05:34.235 "aliases": [ 00:05:34.235 "9be0f4ea-8dca-4fff-85e0-d241490b8e28" 00:05:34.235 ], 00:05:34.235 "product_name": "Malloc disk", 00:05:34.235 "block_size": 512, 00:05:34.235 "num_blocks": 16384, 00:05:34.235 "uuid": "9be0f4ea-8dca-4fff-85e0-d241490b8e28", 00:05:34.235 "assigned_rate_limits": { 00:05:34.235 "rw_ios_per_sec": 0, 00:05:34.235 "rw_mbytes_per_sec": 0, 00:05:34.235 "r_mbytes_per_sec": 0, 00:05:34.235 "w_mbytes_per_sec": 0 00:05:34.235 }, 00:05:34.235 
"claimed": false, 00:05:34.235 "zoned": false, 00:05:34.235 "supported_io_types": { 00:05:34.235 "read": true, 00:05:34.235 "write": true, 00:05:34.235 "unmap": true, 00:05:34.235 "flush": true, 00:05:34.235 "reset": true, 00:05:34.235 "nvme_admin": false, 00:05:34.235 "nvme_io": false, 00:05:34.235 "nvme_io_md": false, 00:05:34.235 "write_zeroes": true, 00:05:34.235 "zcopy": true, 00:05:34.235 "get_zone_info": false, 00:05:34.235 "zone_management": false, 00:05:34.235 "zone_append": false, 00:05:34.235 "compare": false, 00:05:34.235 "compare_and_write": false, 00:05:34.235 "abort": true, 00:05:34.235 "seek_hole": false, 00:05:34.235 "seek_data": false, 00:05:34.235 "copy": true, 00:05:34.235 "nvme_iov_md": false 00:05:34.235 }, 00:05:34.235 "memory_domains": [ 00:05:34.235 { 00:05:34.235 "dma_device_id": "system", 00:05:34.235 "dma_device_type": 1 00:05:34.235 }, 00:05:34.235 { 00:05:34.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.235 "dma_device_type": 2 00:05:34.235 } 00:05:34.235 ], 00:05:34.235 "driver_specific": {} 00:05:34.235 } 00:05:34.235 ]' 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.235 [2024-07-15 13:28:21.838722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:34.235 [2024-07-15 13:28:21.838757] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:34.235 [2024-07-15 13:28:21.838771] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14df520 00:05:34.235 [2024-07-15 13:28:21.838780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:34.235 [2024-07-15 13:28:21.839927] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:34.235 [2024-07-15 13:28:21.839951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:34.235 Passthru0 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.235 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.235 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:34.494 { 00:05:34.494 "name": "Malloc0", 00:05:34.494 "aliases": [ 00:05:34.494 "9be0f4ea-8dca-4fff-85e0-d241490b8e28" 00:05:34.494 ], 00:05:34.494 "product_name": "Malloc disk", 00:05:34.494 "block_size": 512, 00:05:34.494 "num_blocks": 16384, 00:05:34.494 "uuid": "9be0f4ea-8dca-4fff-85e0-d241490b8e28", 00:05:34.494 "assigned_rate_limits": { 00:05:34.494 "rw_ios_per_sec": 0, 00:05:34.494 "rw_mbytes_per_sec": 0, 00:05:34.494 "r_mbytes_per_sec": 0, 00:05:34.494 "w_mbytes_per_sec": 0 00:05:34.494 }, 00:05:34.494 "claimed": true, 00:05:34.494 "claim_type": "exclusive_write", 00:05:34.494 "zoned": false, 00:05:34.494 "supported_io_types": { 00:05:34.494 "read": true, 00:05:34.494 "write": true, 00:05:34.494 "unmap": true, 00:05:34.494 "flush": true, 
00:05:34.494 "reset": true, 00:05:34.494 "nvme_admin": false, 00:05:34.494 "nvme_io": false, 00:05:34.494 "nvme_io_md": false, 00:05:34.494 "write_zeroes": true, 00:05:34.494 "zcopy": true, 00:05:34.494 "get_zone_info": false, 00:05:34.494 "zone_management": false, 00:05:34.494 "zone_append": false, 00:05:34.494 "compare": false, 00:05:34.494 "compare_and_write": false, 00:05:34.494 "abort": true, 00:05:34.494 "seek_hole": false, 00:05:34.494 "seek_data": false, 00:05:34.494 "copy": true, 00:05:34.494 "nvme_iov_md": false 00:05:34.494 }, 00:05:34.494 "memory_domains": [ 00:05:34.494 { 00:05:34.494 "dma_device_id": "system", 00:05:34.494 "dma_device_type": 1 00:05:34.494 }, 00:05:34.494 { 00:05:34.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.494 "dma_device_type": 2 00:05:34.494 } 00:05:34.494 ], 00:05:34.494 "driver_specific": {} 00:05:34.494 }, 00:05:34.494 { 00:05:34.494 "name": "Passthru0", 00:05:34.494 "aliases": [ 00:05:34.494 "b494613f-8d46-5491-955d-3500f288b322" 00:05:34.494 ], 00:05:34.494 "product_name": "passthru", 00:05:34.494 "block_size": 512, 00:05:34.494 "num_blocks": 16384, 00:05:34.494 "uuid": "b494613f-8d46-5491-955d-3500f288b322", 00:05:34.494 "assigned_rate_limits": { 00:05:34.494 "rw_ios_per_sec": 0, 00:05:34.494 "rw_mbytes_per_sec": 0, 00:05:34.494 "r_mbytes_per_sec": 0, 00:05:34.494 "w_mbytes_per_sec": 0 00:05:34.494 }, 00:05:34.494 "claimed": false, 00:05:34.494 "zoned": false, 00:05:34.494 "supported_io_types": { 00:05:34.494 "read": true, 00:05:34.494 "write": true, 00:05:34.494 "unmap": true, 00:05:34.494 "flush": true, 00:05:34.494 "reset": true, 00:05:34.494 "nvme_admin": false, 00:05:34.494 "nvme_io": false, 00:05:34.494 "nvme_io_md": false, 00:05:34.494 "write_zeroes": true, 00:05:34.494 "zcopy": true, 00:05:34.494 "get_zone_info": false, 00:05:34.494 "zone_management": false, 00:05:34.494 "zone_append": false, 00:05:34.494 "compare": false, 00:05:34.494 "compare_and_write": false, 00:05:34.494 "abort": true, 00:05:34.494 "seek_hole": false, 00:05:34.494 "seek_data": false, 00:05:34.494 "copy": true, 00:05:34.494 "nvme_iov_md": false 00:05:34.494 }, 00:05:34.494 "memory_domains": [ 00:05:34.494 { 00:05:34.494 "dma_device_id": "system", 00:05:34.494 "dma_device_type": 1 00:05:34.494 }, 00:05:34.494 { 00:05:34.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.494 "dma_device_type": 2 00:05:34.494 } 00:05:34.494 ], 00:05:34.494 "driver_specific": { 00:05:34.494 "passthru": { 00:05:34.494 "name": "Passthru0", 00:05:34.494 "base_bdev_name": "Malloc0" 00:05:34.494 } 00:05:34.494 } 00:05:34.494 } 00:05:34.494 ]' 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:34.494 13:28:21 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:34.494 00:05:34.494 real 0m0.270s 00:05:34.494 user 0m0.152s 00:05:34.494 sys 0m0.054s 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:34.494 13:28:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.494 ************************************ 00:05:34.494 END TEST rpc_integrity 00:05:34.495 ************************************ 00:05:34.495 13:28:22 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:34.495 13:28:22 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:34.495 13:28:22 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:34.495 13:28:22 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.495 13:28:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.495 ************************************ 00:05:34.495 START TEST rpc_plugins 00:05:34.495 ************************************ 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:05:34.495 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.495 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:34.495 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:34.495 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.495 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:34.495 { 00:05:34.495 "name": "Malloc1", 00:05:34.495 "aliases": [ 00:05:34.495 "1d8dfa06-b769-4a5f-a1a0-4d73af4eb08b" 00:05:34.495 ], 00:05:34.495 "product_name": "Malloc disk", 00:05:34.495 "block_size": 4096, 00:05:34.495 "num_blocks": 256, 00:05:34.495 "uuid": "1d8dfa06-b769-4a5f-a1a0-4d73af4eb08b", 00:05:34.495 "assigned_rate_limits": { 00:05:34.495 "rw_ios_per_sec": 0, 00:05:34.495 "rw_mbytes_per_sec": 0, 00:05:34.495 "r_mbytes_per_sec": 0, 00:05:34.495 "w_mbytes_per_sec": 0 00:05:34.495 }, 00:05:34.495 "claimed": false, 00:05:34.495 "zoned": false, 00:05:34.495 "supported_io_types": { 00:05:34.495 "read": true, 00:05:34.495 "write": true, 00:05:34.495 "unmap": true, 00:05:34.495 "flush": true, 00:05:34.495 "reset": true, 00:05:34.495 "nvme_admin": false, 00:05:34.495 "nvme_io": false, 00:05:34.495 "nvme_io_md": false, 00:05:34.495 "write_zeroes": true, 00:05:34.495 "zcopy": true, 00:05:34.495 "get_zone_info": false, 00:05:34.495 "zone_management": false, 00:05:34.495 "zone_append": false, 00:05:34.495 "compare": false, 00:05:34.495 "compare_and_write": false, 00:05:34.495 "abort": true, 00:05:34.495 "seek_hole": false, 00:05:34.495 "seek_data": 
false, 00:05:34.495 "copy": true, 00:05:34.495 "nvme_iov_md": false 00:05:34.495 }, 00:05:34.495 "memory_domains": [ 00:05:34.495 { 00:05:34.495 "dma_device_id": "system", 00:05:34.495 "dma_device_type": 1 00:05:34.495 }, 00:05:34.495 { 00:05:34.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.495 "dma_device_type": 2 00:05:34.495 } 00:05:34.495 ], 00:05:34.495 "driver_specific": {} 00:05:34.495 } 00:05:34.495 ]' 00:05:34.495 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:34.753 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:34.753 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.753 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.753 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:34.753 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:34.753 13:28:22 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:34.753 00:05:34.753 real 0m0.146s 00:05:34.753 user 0m0.084s 00:05:34.753 sys 0m0.026s 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:34.753 13:28:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:34.753 ************************************ 00:05:34.753 END TEST rpc_plugins 00:05:34.753 ************************************ 00:05:34.753 13:28:22 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:34.753 13:28:22 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:34.753 13:28:22 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:34.753 13:28:22 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.753 13:28:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.753 ************************************ 00:05:34.753 START TEST rpc_trace_cmd_test 00:05:34.753 ************************************ 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.753 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:34.753 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid4129184", 00:05:34.753 "tpoint_group_mask": "0x8", 00:05:34.753 "iscsi_conn": { 00:05:34.753 "mask": "0x2", 00:05:34.753 "tpoint_mask": "0x0" 00:05:34.753 }, 00:05:34.753 "scsi": { 00:05:34.753 "mask": "0x4", 00:05:34.753 "tpoint_mask": "0x0" 00:05:34.753 }, 00:05:34.753 "bdev": { 00:05:34.753 "mask": "0x8", 00:05:34.753 "tpoint_mask": "0xffffffffffffffff" 00:05:34.753 }, 00:05:34.753 "nvmf_rdma": { 00:05:34.753 
"mask": "0x10", 00:05:34.753 "tpoint_mask": "0x0" 00:05:34.753 }, 00:05:34.753 "nvmf_tcp": { 00:05:34.754 "mask": "0x20", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "ftl": { 00:05:34.754 "mask": "0x40", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "blobfs": { 00:05:34.754 "mask": "0x80", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "dsa": { 00:05:34.754 "mask": "0x200", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "thread": { 00:05:34.754 "mask": "0x400", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "nvme_pcie": { 00:05:34.754 "mask": "0x800", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "iaa": { 00:05:34.754 "mask": "0x1000", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "nvme_tcp": { 00:05:34.754 "mask": "0x2000", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "bdev_nvme": { 00:05:34.754 "mask": "0x4000", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 }, 00:05:34.754 "sock": { 00:05:34.754 "mask": "0x8000", 00:05:34.754 "tpoint_mask": "0x0" 00:05:34.754 } 00:05:34.754 }' 00:05:34.754 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:34.754 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:34.754 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:34.754 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:34.754 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:35.012 00:05:35.012 real 0m0.180s 00:05:35.012 user 0m0.144s 00:05:35.012 sys 0m0.027s 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:35.012 13:28:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:35.012 ************************************ 00:05:35.012 END TEST rpc_trace_cmd_test 00:05:35.012 ************************************ 00:05:35.012 13:28:22 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:35.012 13:28:22 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:35.012 13:28:22 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:35.012 13:28:22 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:35.012 13:28:22 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.012 13:28:22 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.012 13:28:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.012 ************************************ 00:05:35.012 START TEST rpc_daemon_integrity 00:05:35.012 ************************************ 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:35.012 { 00:05:35.012 "name": "Malloc2", 00:05:35.012 "aliases": [ 00:05:35.012 "69fe93c1-271c-4028-8aea-ed0b6c771864" 00:05:35.012 ], 00:05:35.012 "product_name": "Malloc disk", 00:05:35.012 "block_size": 512, 00:05:35.012 "num_blocks": 16384, 00:05:35.012 "uuid": "69fe93c1-271c-4028-8aea-ed0b6c771864", 00:05:35.012 "assigned_rate_limits": { 00:05:35.012 "rw_ios_per_sec": 0, 00:05:35.012 "rw_mbytes_per_sec": 0, 00:05:35.012 "r_mbytes_per_sec": 0, 00:05:35.012 "w_mbytes_per_sec": 0 00:05:35.012 }, 00:05:35.012 "claimed": false, 00:05:35.012 "zoned": false, 00:05:35.012 "supported_io_types": { 00:05:35.012 "read": true, 00:05:35.012 "write": true, 00:05:35.012 "unmap": true, 00:05:35.012 "flush": true, 00:05:35.012 "reset": true, 00:05:35.012 "nvme_admin": false, 00:05:35.012 "nvme_io": false, 00:05:35.012 "nvme_io_md": false, 00:05:35.012 "write_zeroes": true, 00:05:35.012 "zcopy": true, 00:05:35.012 "get_zone_info": false, 00:05:35.012 "zone_management": false, 00:05:35.012 "zone_append": false, 00:05:35.012 "compare": false, 00:05:35.012 "compare_and_write": false, 00:05:35.012 "abort": true, 00:05:35.012 "seek_hole": false, 00:05:35.012 "seek_data": false, 00:05:35.012 "copy": true, 00:05:35.012 "nvme_iov_md": false 00:05:35.012 }, 00:05:35.012 "memory_domains": [ 00:05:35.012 { 00:05:35.012 "dma_device_id": "system", 00:05:35.012 "dma_device_type": 1 00:05:35.012 }, 00:05:35.012 { 00:05:35.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.012 "dma_device_type": 2 00:05:35.012 } 00:05:35.012 ], 00:05:35.012 "driver_specific": {} 00:05:35.012 } 00:05:35.012 ]' 00:05:35.012 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.271 [2024-07-15 13:28:22.644933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:35.271 [2024-07-15 13:28:22.644964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:35.271 
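The integrity tests in this log all run the same create/verify/delete loop: build a malloc bdev, layer a passthru bdev on top of it (the vbdev_passthru notices show the base bdev being matched and claimed), check that bdev_get_bdevs reports both devices, then delete them and check the list is empty again. A minimal sketch of that loop using scripts/rpc.py and jq, with the same RPC names that appear in the log; exact sizes and bdev names are illustrative:

    set -e
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py "$@"; }
    malloc=$(rpc bdev_malloc_create 8 512)           # prints the new bdev name, e.g. Malloc2
    rpc bdev_passthru_create -b "$malloc" -p Passthru0
    [ "$(rpc bdev_get_bdevs | jq length)" -eq 2 ]    # malloc + passthru both reported
    rpc bdev_passthru_delete Passthru0
    rpc bdev_malloc_delete "$malloc"
    [ "$(rpc bdev_get_bdevs | jq length)" -eq 0 ]    # nothing left behind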
[2024-07-15 13:28:22.644979] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1688b80 00:05:35.271 [2024-07-15 13:28:22.644988] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:35.271 [2024-07-15 13:28:22.645975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:35.271 [2024-07-15 13:28:22.646007] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:35.271 Passthru0 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:35.271 { 00:05:35.271 "name": "Malloc2", 00:05:35.271 "aliases": [ 00:05:35.271 "69fe93c1-271c-4028-8aea-ed0b6c771864" 00:05:35.271 ], 00:05:35.271 "product_name": "Malloc disk", 00:05:35.271 "block_size": 512, 00:05:35.271 "num_blocks": 16384, 00:05:35.271 "uuid": "69fe93c1-271c-4028-8aea-ed0b6c771864", 00:05:35.271 "assigned_rate_limits": { 00:05:35.271 "rw_ios_per_sec": 0, 00:05:35.271 "rw_mbytes_per_sec": 0, 00:05:35.271 "r_mbytes_per_sec": 0, 00:05:35.271 "w_mbytes_per_sec": 0 00:05:35.271 }, 00:05:35.271 "claimed": true, 00:05:35.271 "claim_type": "exclusive_write", 00:05:35.271 "zoned": false, 00:05:35.271 "supported_io_types": { 00:05:35.271 "read": true, 00:05:35.271 "write": true, 00:05:35.271 "unmap": true, 00:05:35.271 "flush": true, 00:05:35.271 "reset": true, 00:05:35.271 "nvme_admin": false, 00:05:35.271 "nvme_io": false, 00:05:35.271 "nvme_io_md": false, 00:05:35.271 "write_zeroes": true, 00:05:35.271 "zcopy": true, 00:05:35.271 "get_zone_info": false, 00:05:35.271 "zone_management": false, 00:05:35.271 "zone_append": false, 00:05:35.271 "compare": false, 00:05:35.271 "compare_and_write": false, 00:05:35.271 "abort": true, 00:05:35.271 "seek_hole": false, 00:05:35.271 "seek_data": false, 00:05:35.271 "copy": true, 00:05:35.271 "nvme_iov_md": false 00:05:35.271 }, 00:05:35.271 "memory_domains": [ 00:05:35.271 { 00:05:35.271 "dma_device_id": "system", 00:05:35.271 "dma_device_type": 1 00:05:35.271 }, 00:05:35.271 { 00:05:35.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.271 "dma_device_type": 2 00:05:35.271 } 00:05:35.271 ], 00:05:35.271 "driver_specific": {} 00:05:35.271 }, 00:05:35.271 { 00:05:35.271 "name": "Passthru0", 00:05:35.271 "aliases": [ 00:05:35.271 "5fb674d4-a9d3-50d1-b332-8772cb06f97c" 00:05:35.271 ], 00:05:35.271 "product_name": "passthru", 00:05:35.271 "block_size": 512, 00:05:35.271 "num_blocks": 16384, 00:05:35.271 "uuid": "5fb674d4-a9d3-50d1-b332-8772cb06f97c", 00:05:35.271 "assigned_rate_limits": { 00:05:35.271 "rw_ios_per_sec": 0, 00:05:35.271 "rw_mbytes_per_sec": 0, 00:05:35.271 "r_mbytes_per_sec": 0, 00:05:35.271 "w_mbytes_per_sec": 0 00:05:35.271 }, 00:05:35.271 "claimed": false, 00:05:35.271 "zoned": false, 00:05:35.271 "supported_io_types": { 00:05:35.271 "read": true, 00:05:35.271 "write": true, 00:05:35.271 "unmap": true, 00:05:35.271 "flush": true, 00:05:35.271 "reset": true, 00:05:35.271 "nvme_admin": false, 00:05:35.271 "nvme_io": false, 00:05:35.271 "nvme_io_md": false, 00:05:35.271 
"write_zeroes": true, 00:05:35.271 "zcopy": true, 00:05:35.271 "get_zone_info": false, 00:05:35.271 "zone_management": false, 00:05:35.271 "zone_append": false, 00:05:35.271 "compare": false, 00:05:35.271 "compare_and_write": false, 00:05:35.271 "abort": true, 00:05:35.271 "seek_hole": false, 00:05:35.271 "seek_data": false, 00:05:35.271 "copy": true, 00:05:35.271 "nvme_iov_md": false 00:05:35.271 }, 00:05:35.271 "memory_domains": [ 00:05:35.271 { 00:05:35.271 "dma_device_id": "system", 00:05:35.271 "dma_device_type": 1 00:05:35.271 }, 00:05:35.271 { 00:05:35.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.271 "dma_device_type": 2 00:05:35.271 } 00:05:35.271 ], 00:05:35.271 "driver_specific": { 00:05:35.271 "passthru": { 00:05:35.271 "name": "Passthru0", 00:05:35.271 "base_bdev_name": "Malloc2" 00:05:35.271 } 00:05:35.271 } 00:05:35.271 } 00:05:35.271 ]' 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.271 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:35.272 00:05:35.272 real 0m0.254s 00:05:35.272 user 0m0.149s 00:05:35.272 sys 0m0.047s 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:35.272 13:28:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.272 ************************************ 00:05:35.272 END TEST rpc_daemon_integrity 00:05:35.272 ************************************ 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:35.272 13:28:22 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:35.272 13:28:22 rpc -- rpc/rpc.sh@84 -- # killprocess 4129184 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@948 -- # '[' -z 4129184 ']' 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@952 -- # kill -0 4129184 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@953 -- # uname 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4129184 00:05:35.272 13:28:22 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4129184' 00:05:35.272 killing process with pid 4129184 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@967 -- # kill 4129184 00:05:35.272 13:28:22 rpc -- common/autotest_common.sh@972 -- # wait 4129184 00:05:35.838 00:05:35.838 real 0m2.512s 00:05:35.838 user 0m3.068s 00:05:35.838 sys 0m0.818s 00:05:35.838 13:28:23 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:35.838 13:28:23 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.838 ************************************ 00:05:35.838 END TEST rpc 00:05:35.838 ************************************ 00:05:35.838 13:28:23 -- common/autotest_common.sh@1142 -- # return 0 00:05:35.838 13:28:23 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:35.838 13:28:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.838 13:28:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.838 13:28:23 -- common/autotest_common.sh@10 -- # set +x 00:05:35.838 ************************************ 00:05:35.838 START TEST skip_rpc 00:05:35.838 ************************************ 00:05:35.838 13:28:23 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:35.838 * Looking for test storage... 00:05:35.838 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:35.838 13:28:23 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:35.838 13:28:23 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:35.838 13:28:23 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:35.838 13:28:23 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.838 13:28:23 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.838 13:28:23 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.838 ************************************ 00:05:35.838 START TEST skip_rpc 00:05:35.838 ************************************ 00:05:35.838 13:28:23 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:05:36.096 13:28:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=4129686 00:05:36.096 13:28:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:36.096 13:28:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:36.096 13:28:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:36.096 [2024-07-15 13:28:23.515823] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
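For reference, the rpc_daemon_integrity sequence above (claim a malloc bdev with a passthru bdev, confirm both show up, then tear them down) can be reproduced by hand against a running spdk_tgt. A minimal sketch using the same rpc.py calls that appear in the trace; the default /var/tmp/spdk.sock is assumed and the bdev names are only illustrative:

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  $RPC bdev_malloc_create 8 512 --name Malloc2        # 8 MiB / 512 B = 16384 blocks, as listed above
  $RPC bdev_passthru_create -b Malloc2 -p Passthru0   # passthru claims the malloc bdev
  $RPC bdev_get_bdevs | jq length                     # expect 2
  $RPC bdev_passthru_delete Passthru0
  $RPC bdev_malloc_delete Malloc2
  $RPC bdev_get_bdevs | jq length                     # expect 0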
00:05:36.096 [2024-07-15 13:28:23.515880] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4129686 ] 00:05:36.096 [2024-07-15 13:28:23.606186] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.096 [2024-07-15 13:28:23.695339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 4129686 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 4129686 ']' 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 4129686 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4129686 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4129686' 00:05:41.417 killing process with pid 4129686 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 4129686 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 4129686 00:05:41.417 00:05:41.417 real 0m5.424s 00:05:41.417 user 0m5.110s 00:05:41.417 sys 0m0.331s 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:41.417 13:28:28 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:41.417 ************************************ 00:05:41.417 END TEST skip_rpc 00:05:41.417 
************************************ 00:05:41.417 13:28:28 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:41.417 13:28:28 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:41.417 13:28:28 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:41.417 13:28:28 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.417 13:28:28 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:41.417 ************************************ 00:05:41.417 START TEST skip_rpc_with_json 00:05:41.417 ************************************ 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=4130477 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 4130477 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 4130477 ']' 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.417 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:41.417 13:28:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:41.417 [2024-07-15 13:28:29.020438] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
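The skip_rpc case above starts the target with --no-rpc-server, so every rpc.py call is expected to fail and the NOT wrapper asserts exactly that. A quick manual check of the same behaviour, assuming the workspace paths used in this job:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  sleep 5                                  # no socket to poll, hence the fixed sleep in the test
  $SPDK/scripts/rpc.py spdk_get_version    # expected to fail: no RPC listener was started
  kill %1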
00:05:41.417 [2024-07-15 13:28:29.020494] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4130477 ] 00:05:41.676 [2024-07-15 13:28:29.105355] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.676 [2024-07-15 13:28:29.186875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.242 [2024-07-15 13:28:29.815325] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:42.242 request: 00:05:42.242 { 00:05:42.242 "trtype": "tcp", 00:05:42.242 "method": "nvmf_get_transports", 00:05:42.242 "req_id": 1 00:05:42.242 } 00:05:42.242 Got JSON-RPC error response 00:05:42.242 response: 00:05:42.242 { 00:05:42.242 "code": -19, 00:05:42.242 "message": "No such device" 00:05:42.242 } 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.242 [2024-07-15 13:28:29.823430] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.242 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.501 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:42.501 13:28:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:42.501 { 00:05:42.501 "subsystems": [ 00:05:42.501 { 00:05:42.501 "subsystem": "keyring", 00:05:42.501 "config": [] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "iobuf", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "iobuf_set_options", 00:05:42.501 "params": { 00:05:42.501 "small_pool_count": 8192, 00:05:42.501 "large_pool_count": 1024, 00:05:42.501 "small_bufsize": 8192, 00:05:42.501 "large_bufsize": 135168 00:05:42.501 } 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "sock", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "sock_set_default_impl", 00:05:42.501 "params": { 00:05:42.501 "impl_name": "posix" 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "sock_impl_set_options", 00:05:42.501 "params": { 00:05:42.501 "impl_name": "ssl", 00:05:42.501 "recv_buf_size": 4096, 00:05:42.501 "send_buf_size": 4096, 
00:05:42.501 "enable_recv_pipe": true, 00:05:42.501 "enable_quickack": false, 00:05:42.501 "enable_placement_id": 0, 00:05:42.501 "enable_zerocopy_send_server": true, 00:05:42.501 "enable_zerocopy_send_client": false, 00:05:42.501 "zerocopy_threshold": 0, 00:05:42.501 "tls_version": 0, 00:05:42.501 "enable_ktls": false 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "sock_impl_set_options", 00:05:42.501 "params": { 00:05:42.501 "impl_name": "posix", 00:05:42.501 "recv_buf_size": 2097152, 00:05:42.501 "send_buf_size": 2097152, 00:05:42.501 "enable_recv_pipe": true, 00:05:42.501 "enable_quickack": false, 00:05:42.501 "enable_placement_id": 0, 00:05:42.501 "enable_zerocopy_send_server": true, 00:05:42.501 "enable_zerocopy_send_client": false, 00:05:42.501 "zerocopy_threshold": 0, 00:05:42.501 "tls_version": 0, 00:05:42.501 "enable_ktls": false 00:05:42.501 } 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "vmd", 00:05:42.501 "config": [] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "accel", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "accel_set_options", 00:05:42.501 "params": { 00:05:42.501 "small_cache_size": 128, 00:05:42.501 "large_cache_size": 16, 00:05:42.501 "task_count": 2048, 00:05:42.501 "sequence_count": 2048, 00:05:42.501 "buf_count": 2048 00:05:42.501 } 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "bdev", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "bdev_set_options", 00:05:42.501 "params": { 00:05:42.501 "bdev_io_pool_size": 65535, 00:05:42.501 "bdev_io_cache_size": 256, 00:05:42.501 "bdev_auto_examine": true, 00:05:42.501 "iobuf_small_cache_size": 128, 00:05:42.501 "iobuf_large_cache_size": 16 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "bdev_raid_set_options", 00:05:42.501 "params": { 00:05:42.501 "process_window_size_kb": 1024 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "bdev_iscsi_set_options", 00:05:42.501 "params": { 00:05:42.501 "timeout_sec": 30 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "bdev_nvme_set_options", 00:05:42.501 "params": { 00:05:42.501 "action_on_timeout": "none", 00:05:42.501 "timeout_us": 0, 00:05:42.501 "timeout_admin_us": 0, 00:05:42.501 "keep_alive_timeout_ms": 10000, 00:05:42.501 "arbitration_burst": 0, 00:05:42.501 "low_priority_weight": 0, 00:05:42.501 "medium_priority_weight": 0, 00:05:42.501 "high_priority_weight": 0, 00:05:42.501 "nvme_adminq_poll_period_us": 10000, 00:05:42.501 "nvme_ioq_poll_period_us": 0, 00:05:42.501 "io_queue_requests": 0, 00:05:42.501 "delay_cmd_submit": true, 00:05:42.501 "transport_retry_count": 4, 00:05:42.501 "bdev_retry_count": 3, 00:05:42.501 "transport_ack_timeout": 0, 00:05:42.501 "ctrlr_loss_timeout_sec": 0, 00:05:42.501 "reconnect_delay_sec": 0, 00:05:42.501 "fast_io_fail_timeout_sec": 0, 00:05:42.501 "disable_auto_failback": false, 00:05:42.501 "generate_uuids": false, 00:05:42.501 "transport_tos": 0, 00:05:42.501 "nvme_error_stat": false, 00:05:42.501 "rdma_srq_size": 0, 00:05:42.501 "io_path_stat": false, 00:05:42.501 "allow_accel_sequence": false, 00:05:42.501 "rdma_max_cq_size": 0, 00:05:42.501 "rdma_cm_event_timeout_ms": 0, 00:05:42.501 "dhchap_digests": [ 00:05:42.501 "sha256", 00:05:42.501 "sha384", 00:05:42.501 "sha512" 00:05:42.501 ], 00:05:42.501 "dhchap_dhgroups": [ 00:05:42.501 "null", 00:05:42.501 "ffdhe2048", 00:05:42.501 "ffdhe3072", 00:05:42.501 "ffdhe4096", 00:05:42.501 
"ffdhe6144", 00:05:42.501 "ffdhe8192" 00:05:42.501 ] 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "bdev_nvme_set_hotplug", 00:05:42.501 "params": { 00:05:42.501 "period_us": 100000, 00:05:42.501 "enable": false 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "bdev_wait_for_examine" 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "scsi", 00:05:42.501 "config": null 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "scheduler", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "framework_set_scheduler", 00:05:42.501 "params": { 00:05:42.501 "name": "static" 00:05:42.501 } 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "vhost_scsi", 00:05:42.501 "config": [] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "vhost_blk", 00:05:42.501 "config": [] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "ublk", 00:05:42.501 "config": [] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "nbd", 00:05:42.501 "config": [] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "nvmf", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "nvmf_set_config", 00:05:42.501 "params": { 00:05:42.501 "discovery_filter": "match_any", 00:05:42.501 "admin_cmd_passthru": { 00:05:42.501 "identify_ctrlr": false 00:05:42.501 } 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "nvmf_set_max_subsystems", 00:05:42.501 "params": { 00:05:42.501 "max_subsystems": 1024 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "nvmf_set_crdt", 00:05:42.501 "params": { 00:05:42.501 "crdt1": 0, 00:05:42.501 "crdt2": 0, 00:05:42.501 "crdt3": 0 00:05:42.501 } 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "method": "nvmf_create_transport", 00:05:42.501 "params": { 00:05:42.501 "trtype": "TCP", 00:05:42.501 "max_queue_depth": 128, 00:05:42.501 "max_io_qpairs_per_ctrlr": 127, 00:05:42.501 "in_capsule_data_size": 4096, 00:05:42.501 "max_io_size": 131072, 00:05:42.501 "io_unit_size": 131072, 00:05:42.501 "max_aq_depth": 128, 00:05:42.501 "num_shared_buffers": 511, 00:05:42.501 "buf_cache_size": 4294967295, 00:05:42.501 "dif_insert_or_strip": false, 00:05:42.501 "zcopy": false, 00:05:42.501 "c2h_success": true, 00:05:42.501 "sock_priority": 0, 00:05:42.501 "abort_timeout_sec": 1, 00:05:42.501 "ack_timeout": 0, 00:05:42.501 "data_wr_pool_size": 0 00:05:42.501 } 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 }, 00:05:42.501 { 00:05:42.501 "subsystem": "iscsi", 00:05:42.501 "config": [ 00:05:42.501 { 00:05:42.501 "method": "iscsi_set_options", 00:05:42.501 "params": { 00:05:42.501 "node_base": "iqn.2016-06.io.spdk", 00:05:42.501 "max_sessions": 128, 00:05:42.501 "max_connections_per_session": 2, 00:05:42.501 "max_queue_depth": 64, 00:05:42.501 "default_time2wait": 2, 00:05:42.501 "default_time2retain": 20, 00:05:42.501 "first_burst_length": 8192, 00:05:42.501 "immediate_data": true, 00:05:42.501 "allow_duplicated_isid": false, 00:05:42.501 "error_recovery_level": 0, 00:05:42.501 "nop_timeout": 60, 00:05:42.501 "nop_in_interval": 30, 00:05:42.501 "disable_chap": false, 00:05:42.501 "require_chap": false, 00:05:42.501 "mutual_chap": false, 00:05:42.501 "chap_group": 0, 00:05:42.501 "max_large_datain_per_connection": 64, 00:05:42.501 "max_r2t_per_connection": 4, 00:05:42.501 "pdu_pool_size": 36864, 00:05:42.501 "immediate_data_pool_size": 16384, 00:05:42.501 "data_out_pool_size": 2048 00:05:42.501 } 00:05:42.501 } 00:05:42.501 ] 00:05:42.501 } 
00:05:42.501 ] 00:05:42.501 } 00:05:42.501 13:28:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:42.501 13:28:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 4130477 00:05:42.501 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 4130477 ']' 00:05:42.502 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 4130477 00:05:42.502 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:42.502 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:42.502 13:28:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4130477 00:05:42.502 13:28:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:42.502 13:28:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:42.502 13:28:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4130477' 00:05:42.502 killing process with pid 4130477 00:05:42.502 13:28:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 4130477 00:05:42.502 13:28:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 4130477 00:05:42.760 13:28:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=4130701 00:05:42.760 13:28:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:42.760 13:28:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 4130701 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 4130701 ']' 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 4130701 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4130701 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4130701' 00:05:48.023 killing process with pid 4130701 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 4130701 00:05:48.023 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 4130701 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:48.282 00:05:48.282 real 0m6.803s 00:05:48.282 user 0m6.479s 00:05:48.282 sys 0m0.711s 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.282 
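The skip_rpc_with_json pass above is a save/restore round trip: create the TCP transport on a live target, dump the configuration with save_config, start a fresh target from that JSON, and confirm the transport is re-created. A condensed sketch, assuming the same workspace paths:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  CFG=$SPDK/test/rpc/config.json
  LOG=$SPDK/test/rpc/log.txt
  $SPDK/scripts/rpc.py nvmf_create_transport -t tcp
  $SPDK/scripts/rpc.py save_config > $CFG
  $SPDK/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json $CFG > $LOG 2>&1 &
  sleep 5
  grep -q 'TCP Transport Init' $LOG        # the saved config was replayed on startup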
13:28:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:48.282 ************************************ 00:05:48.282 END TEST skip_rpc_with_json 00:05:48.282 ************************************ 00:05:48.282 13:28:35 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:48.282 13:28:35 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:48.282 13:28:35 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.282 13:28:35 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.282 13:28:35 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:48.282 ************************************ 00:05:48.282 START TEST skip_rpc_with_delay 00:05:48.282 ************************************ 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:48.282 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:48.540 [2024-07-15 13:28:35.904754] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
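The error above is the point of skip_rpc_with_delay: --wait-for-rpc defers subsystem initialization until an RPC arrives, so it is rejected when --no-rpc-server removes the listener. With the listener left enabled the flag works as intended; a minimal sketch, where framework_start_init is the RPC that releases the target from its wait state:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt -m 0x1 --wait-for-rpc &
  # a real script would poll the socket first, as waitforlisten does
  $SPDK/scripts/rpc.py framework_start_init   # finish startup once configuration RPCs are done
  $SPDK/scripts/rpc.py spdk_get_version       # now succeeds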
00:05:48.540 [2024-07-15 13:28:35.904826] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:48.540 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:05:48.540 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:48.540 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:48.540 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:48.540 00:05:48.540 real 0m0.077s 00:05:48.540 user 0m0.045s 00:05:48.540 sys 0m0.032s 00:05:48.540 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.540 13:28:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:48.540 ************************************ 00:05:48.540 END TEST skip_rpc_with_delay 00:05:48.540 ************************************ 00:05:48.540 13:28:35 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:48.540 13:28:35 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:48.540 13:28:35 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:48.540 13:28:35 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:48.540 13:28:35 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.540 13:28:35 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.540 13:28:35 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:48.540 ************************************ 00:05:48.540 START TEST exit_on_failed_rpc_init 00:05:48.540 ************************************ 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=4131466 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 4131466 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 4131466 ']' 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.540 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.540 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:48.540 [2024-07-15 13:28:36.063265] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:05:48.540 [2024-07-15 13:28:36.063322] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4131466 ] 00:05:48.540 [2024-07-15 13:28:36.150272] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.798 [2024-07-15 13:28:36.238389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:49.366 13:28:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.366 [2024-07-15 13:28:36.929342] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:05:49.366 [2024-07-15 13:28:36.929397] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4131507 ] 00:05:49.624 [2024-07-15 13:28:37.017747] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.624 [2024-07-15 13:28:37.107356] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.624 [2024-07-15 13:28:37.107429] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
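exit_on_failed_rpc_init then confirms that a second target pointed at the same RPC socket fails with the error above. Two targets can coexist only with distinct sockets, selected with -r on spdk_tgt and -s on rpc.py; the socket names below are purely illustrative:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk_a.sock &
  $SPDK/build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk_b.sock &
  $SPDK/scripts/rpc.py -s /var/tmp/spdk_a.sock spdk_get_version
  $SPDK/scripts/rpc.py -s /var/tmp/spdk_b.sock spdk_get_version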
00:05:49.624 [2024-07-15 13:28:37.107441] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:49.624 [2024-07-15 13:28:37.107450] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 4131466 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 4131466 ']' 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 4131466 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:49.624 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4131466 00:05:49.883 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:49.883 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:49.883 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4131466' 00:05:49.883 killing process with pid 4131466 00:05:49.883 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 4131466 00:05:49.883 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 4131466 00:05:50.142 00:05:50.142 real 0m1.611s 00:05:50.142 user 0m1.783s 00:05:50.142 sys 0m0.539s 00:05:50.142 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.142 13:28:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:50.142 ************************************ 00:05:50.142 END TEST exit_on_failed_rpc_init 00:05:50.142 ************************************ 00:05:50.142 13:28:37 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:50.142 13:28:37 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:50.142 00:05:50.142 real 0m14.366s 00:05:50.142 user 0m13.573s 00:05:50.142 sys 0m1.943s 00:05:50.142 13:28:37 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.142 13:28:37 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:50.142 ************************************ 00:05:50.142 END TEST skip_rpc 00:05:50.142 ************************************ 00:05:50.142 13:28:37 -- common/autotest_common.sh@1142 -- # return 0 00:05:50.142 13:28:37 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:50.142 13:28:37 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:50.142 13:28:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:50.142 13:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:50.142 ************************************ 00:05:50.142 START TEST rpc_client 00:05:50.142 ************************************ 00:05:50.142 13:28:37 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:50.401 * Looking for test storage... 00:05:50.401 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:50.401 13:28:37 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:50.401 OK 00:05:50.401 13:28:37 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:50.401 00:05:50.401 real 0m0.130s 00:05:50.401 user 0m0.047s 00:05:50.401 sys 0m0.090s 00:05:50.401 13:28:37 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.401 13:28:37 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:50.401 ************************************ 00:05:50.401 END TEST rpc_client 00:05:50.401 ************************************ 00:05:50.401 13:28:37 -- common/autotest_common.sh@1142 -- # return 0 00:05:50.401 13:28:37 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:50.401 13:28:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:50.401 13:28:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:50.401 13:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:50.401 ************************************ 00:05:50.401 START TEST json_config 00:05:50.401 ************************************ 00:05:50.401 13:28:37 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:50.659 13:28:38 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00d40ca9-2a78-e711-906e-0017a4403562 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=00d40ca9-2a78-e711-906e-0017a4403562 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:50.659 13:28:38 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:50.659 13:28:38 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:50.659 13:28:38 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:50.659 13:28:38 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:50.659 13:28:38 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:50.659 13:28:38 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.660 13:28:38 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.660 13:28:38 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.660 13:28:38 json_config -- paths/export.sh@5 -- # export PATH 00:05:50.660 13:28:38 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@47 -- # : 0 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:50.660 13:28:38 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:50.660 
13:28:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:05:50.660 INFO: JSON configuration test init 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.660 13:28:38 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:05:50.660 13:28:38 json_config -- json_config/common.sh@9 -- # local app=target 00:05:50.660 13:28:38 json_config -- json_config/common.sh@10 -- # shift 00:05:50.660 13:28:38 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:50.660 13:28:38 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:50.660 13:28:38 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:50.660 13:28:38 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:50.660 13:28:38 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:50.660 13:28:38 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=4131767 00:05:50.660 13:28:38 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:50.660 Waiting for target to run... 
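The json_config target below is started on a private socket with --wait-for-rpc, and waitforlisten simply polls that socket until rpc.py gets through. A rough approximation of that wait (rpc.py's -t timeout flag is assumed here):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
  until $RPC -t 1 spdk_get_version >/dev/null 2>&1; do
      sleep 0.5                 # keep polling until the RPC listener is reachable
  done
  $RPC rpc_get_methods          # the target is now ready for the configuration RPCs that follow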
00:05:50.660 13:28:38 json_config -- json_config/common.sh@25 -- # waitforlisten 4131767 /var/tmp/spdk_tgt.sock 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@829 -- # '[' -z 4131767 ']' 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:50.660 13:28:38 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:50.660 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:50.660 13:28:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.660 [2024-07-15 13:28:38.137529] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:05:50.660 [2024-07-15 13:28:38.137588] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4131767 ] 00:05:50.918 [2024-07-15 13:28:38.457862] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.918 [2024-07-15 13:28:38.535978] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.484 13:28:38 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:51.484 13:28:38 json_config -- common/autotest_common.sh@862 -- # return 0 00:05:51.484 13:28:38 json_config -- json_config/common.sh@26 -- # echo '' 00:05:51.484 00:05:51.484 13:28:38 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:05:51.484 13:28:38 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:05:51.484 13:28:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:51.484 13:28:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:51.484 13:28:38 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:05:51.484 13:28:38 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:51.484 13:28:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:51.742 13:28:39 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:51.742 13:28:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:51.742 [2024-07-15 13:28:39.270196] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:51.742 13:28:39 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:51.742 13:28:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:52.000 [2024-07-15 13:28:39.446636] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:52.000 13:28:39 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:05:52.000 13:28:39 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:52.000 13:28:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.000 13:28:39 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:52.000 13:28:39 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:05:52.000 13:28:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:52.258 [2024-07-15 13:28:39.686247] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:54.785 13:28:42 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:54.785 13:28:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:54.785 13:28:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:05:54.785 13:28:42 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:05:54.785 13:28:42 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:54.785 13:28:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@55 -- # return 0 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:05:55.043 13:28:42 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:55.043 13:28:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:55.043 13:28:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:05:55.043 13:28:42 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:55.043 13:28:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:55.301 Nvme0n1p0 Nvme0n1p1 00:05:55.301 13:28:42 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:55.301 13:28:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:55.600 [2024-07-15 13:28:42.972017] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:55.600 [2024-07-15 13:28:42.972060] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:55.600 00:05:55.600 13:28:42 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:55.600 13:28:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:55.600 Malloc3 00:05:55.600 13:28:43 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:55.600 13:28:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:55.858 [2024-07-15 13:28:43.304916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:55.858 [2024-07-15 13:28:43.304951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:55.858 [2024-07-15 13:28:43.304972] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c986d0 00:05:55.858 [2024-07-15 13:28:43.304980] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:55.858 [2024-07-15 13:28:43.306107] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:55.858 [2024-07-15 13:28:43.306130] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:55.858 PTBdevFromMalloc3 00:05:55.858 13:28:43 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:55.858 13:28:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:55.858 Null0 00:05:56.116 13:28:43 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:56.116 13:28:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:56.116 Malloc0 00:05:56.116 13:28:43 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:56.116 13:28:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:56.376 Malloc1 00:05:56.376 13:28:43 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:56.376 13:28:43 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:56.635 102400+0 records in 00:05:56.635 102400+0 records out 00:05:56.635 104857600 bytes (105 MB, 100 MiB) copied, 0.21657 s, 484 MB/s 00:05:56.635 13:28:44 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:56.635 13:28:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:56.635 aio_disk 00:05:56.635 13:28:44 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:56.635 13:28:44 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:56.635 13:28:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:56.891 9efcdc50-07a9-40ae-8b3e-1a5fb147a2ed 00:05:56.891 13:28:44 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:05:56.891 13:28:44 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:05:56.892 13:28:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:05:57.150 13:28:44 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:05:57.150 13:28:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:05:57.408 13:28:44 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:57.408 13:28:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:57.408 13:28:44 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:57.408 13:28:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:57.665 13:28:45 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:05:57.665 13:28:45 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:57.665 13:28:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:57.923 MallocForCryptoBdev 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@159 -- # wc -l 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:57.923 13:28:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:57.923 [2024-07-15 13:28:45.526114] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:05:57.923 CryptoMallocBdev 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:05:57.923 13:28:45 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:adcf9100-3980-45e8-a58d-a2f15c63b3f5 bdev_register:96b2c227-ca26-4393-b4bd-a56b57a75b43 bdev_register:907ef19a-c6ae-4c8e-a8af-e102377146b3 bdev_register:b3152e29-a6ce-4017-ad58-cf956f90fe52 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:adcf9100-3980-45e8-a58d-a2f15c63b3f5 bdev_register:96b2c227-ca26-4393-b4bd-a56b57a75b43 bdev_register:907ef19a-c6ae-4c8e-a8af-e102377146b3 bdev_register:b3152e29-a6ce-4017-ad58-cf956f90fe52 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@71 -- # sort 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@72 -- # sort 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:58.182 13:28:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:adcf9100-3980-45e8-a58d-a2f15c63b3f5 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:96b2c227-ca26-4393-b4bd-a56b57a75b43 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:907ef19a-c6ae-4c8e-a8af-e102377146b3 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:b3152e29-a6ce-4017-ad58-cf956f90fe52 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:907ef19a-c6ae-4c8e-a8af-e102377146b3 bdev_register:96b2c227-ca26-4393-b4bd-a56b57a75b43 bdev_register:adcf9100-3980-45e8-a58d-a2f15c63b3f5 bdev_register:aio_disk bdev_register:b3152e29-a6ce-4017-ad58-cf956f90fe52 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\0\7\e\f\1\9\a\-\c\6\a\e\-\4\c\8\e\-\a\8\a\f\-\e\1\0\2\3\7\7\1\4\6\b\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\6\b\2\c\2\2\7\-\c\a\2\6\-\4\3\9\3\-\b\4\b\d\-\a\5\6\b\5\7\a\7\5\b\4\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\d\c\f\9\1\0\0\-\3\9\8\0\-\4\5\e\8\-\a\5\8\d\-\a\2\f\1\5\c\6\3\b\3\f\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\3\1\5\2\e\2\9\-\a\6\c\e\-\4\0\1\7\-\a\d\5\8\-\c\f\9\5\6\f\9\0\f\e\5\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@86 -- # cat 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:907ef19a-c6ae-4c8e-a8af-e102377146b3 bdev_register:96b2c227-ca26-4393-b4bd-a56b57a75b43 bdev_register:adcf9100-3980-45e8-a58d-a2f15c63b3f5 bdev_register:aio_disk bdev_register:b3152e29-a6ce-4017-ad58-cf956f90fe52 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:05:58.182 Expected events matched: 00:05:58.182 bdev_register:907ef19a-c6ae-4c8e-a8af-e102377146b3 00:05:58.182 bdev_register:96b2c227-ca26-4393-b4bd-a56b57a75b43 00:05:58.182 bdev_register:adcf9100-3980-45e8-a58d-a2f15c63b3f5 00:05:58.182 bdev_register:aio_disk 00:05:58.182 bdev_register:b3152e29-a6ce-4017-ad58-cf956f90fe52 00:05:58.182 bdev_register:CryptoMallocBdev 00:05:58.182 bdev_register:Malloc0 00:05:58.182 bdev_register:Malloc0p0 00:05:58.182 bdev_register:Malloc0p1 00:05:58.182 bdev_register:Malloc0p2 00:05:58.182 bdev_register:Malloc1 00:05:58.182 bdev_register:Malloc3 00:05:58.182 bdev_register:MallocForCryptoBdev 00:05:58.182 bdev_register:Null0 00:05:58.182 bdev_register:Nvme0n1 00:05:58.182 bdev_register:Nvme0n1p0 00:05:58.182 bdev_register:Nvme0n1p1 00:05:58.182 bdev_register:PTBdevFromMalloc3 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:05:58.182 13:28:45 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:58.182 13:28:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:05:58.182 13:28:45 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:05:58.182 13:28:45 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:58.182 13:28:45 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:05:58.441 13:28:45 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:05:58.441 13:28:45 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:58.441 13:28:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:58.441 MallocBdevForConfigChangeCheck 00:05:58.441 13:28:46 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:05:58.441 13:28:46 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:58.441 13:28:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:58.441 13:28:46 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:05:58.441 13:28:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:59.007 13:28:46 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:05:59.007 INFO: shutting down applications... 00:05:59.007 13:28:46 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:05:59.007 13:28:46 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:05:59.007 13:28:46 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:05:59.007 13:28:46 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:05:59.007 [2024-07-15 13:28:46.516954] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:00.906 Calling clear_iscsi_subsystem 00:06:00.906 Calling clear_nvmf_subsystem 00:06:00.906 Calling clear_nbd_subsystem 00:06:00.906 Calling clear_ublk_subsystem 00:06:00.906 Calling clear_vhost_blk_subsystem 00:06:00.906 Calling clear_vhost_scsi_subsystem 00:06:00.906 Calling clear_bdev_subsystem 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@345 -- # break 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:00.906 13:28:48 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:00.906 13:28:48 json_config -- json_config/common.sh@31 -- # local app=target 00:06:00.906 13:28:48 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:00.906 13:28:48 json_config -- json_config/common.sh@35 -- # [[ -n 
4131767 ]] 00:06:00.906 13:28:48 json_config -- json_config/common.sh@38 -- # kill -SIGINT 4131767 00:06:00.906 13:28:48 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:00.906 13:28:48 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:00.906 13:28:48 json_config -- json_config/common.sh@41 -- # kill -0 4131767 00:06:00.906 13:28:48 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:01.473 13:28:48 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:01.473 13:28:48 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:01.473 13:28:48 json_config -- json_config/common.sh@41 -- # kill -0 4131767 00:06:01.473 13:28:48 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:01.473 13:28:48 json_config -- json_config/common.sh@43 -- # break 00:06:01.473 13:28:48 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:01.473 13:28:48 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:01.473 SPDK target shutdown done 00:06:01.473 13:28:48 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:01.473 INFO: relaunching applications... 00:06:01.473 13:28:48 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:01.473 13:28:48 json_config -- json_config/common.sh@9 -- # local app=target 00:06:01.473 13:28:48 json_config -- json_config/common.sh@10 -- # shift 00:06:01.473 13:28:48 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:01.473 13:28:48 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:01.473 13:28:48 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:01.473 13:28:48 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:01.473 13:28:48 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:01.473 13:28:48 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=4133336 00:06:01.473 13:28:48 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:01.473 Waiting for target to run... 00:06:01.473 13:28:48 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:01.473 13:28:48 json_config -- json_config/common.sh@25 -- # waitforlisten 4133336 /var/tmp/spdk_tgt.sock 00:06:01.473 13:28:48 json_config -- common/autotest_common.sh@829 -- # '[' -z 4133336 ']' 00:06:01.473 13:28:48 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:01.473 13:28:48 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:01.473 13:28:48 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:01.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:01.473 13:28:48 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:01.473 13:28:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:01.473 [2024-07-15 13:28:48.987532] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
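For reference, the relaunch step traced above amounts to dumping the live configuration of the running target over its RPC socket and starting a fresh spdk_tgt preloaded from that JSON. A minimal manual sketch, assuming the RPC socket is /var/tmp/spdk_tgt.sock and that the earlier save_config output was redirected into spdk_tgt_config.json (the redirect itself is not visible in this trace):

    # dump the running target's configuration as JSON (the redirect target is an assumption)
    scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config > spdk_tgt_config.json
    # after stopping the old process, start a new target preloaded from that file
    build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json spdk_tgt_config.json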
00:06:01.473 [2024-07-15 13:28:48.987592] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4133336 ] 00:06:02.040 [2024-07-15 13:28:49.520092] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.040 [2024-07-15 13:28:49.617563] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.297 [2024-07-15 13:28:49.670992] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:02.297 [2024-07-15 13:28:49.679029] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:02.297 [2024-07-15 13:28:49.687039] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:02.297 [2024-07-15 13:28:49.766457] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:04.824 [2024-07-15 13:28:51.948152] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:04.824 [2024-07-15 13:28:51.948209] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:04.824 [2024-07-15 13:28:51.948219] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:04.824 [2024-07-15 13:28:51.956171] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:04.824 [2024-07-15 13:28:51.956193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:04.824 [2024-07-15 13:28:51.964183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:04.824 [2024-07-15 13:28:51.964200] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:04.824 [2024-07-15 13:28:51.972216] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:04.824 [2024-07-15 13:28:51.972248] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:04.824 [2024-07-15 13:28:51.972257] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:04.824 [2024-07-15 13:28:52.319030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:04.824 [2024-07-15 13:28:52.319066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:04.824 [2024-07-15 13:28:52.319078] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b53a0 00:06:04.824 [2024-07-15 13:28:52.319086] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:04.824 [2024-07-15 13:28:52.319292] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:04.824 [2024-07-15 13:28:52.319305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:04.824 13:28:52 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.824 13:28:52 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:04.824 13:28:52 json_config -- json_config/common.sh@26 -- # echo '' 00:06:04.824 00:06:04.824 13:28:52 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:04.824 13:28:52 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:06:04.824 INFO: Checking if target configuration is the same... 00:06:04.824 13:28:52 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:04.824 13:28:52 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:04.824 13:28:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:04.824 + '[' 2 -ne 2 ']' 00:06:04.824 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:04.824 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:04.824 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:04.824 +++ basename /dev/fd/62 00:06:05.081 ++ mktemp /tmp/62.XXX 00:06:05.081 + tmp_file_1=/tmp/62.jpI 00:06:05.081 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:05.081 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:05.081 + tmp_file_2=/tmp/spdk_tgt_config.json.Wnk 00:06:05.081 + ret=0 00:06:05.081 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:05.339 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:05.339 + diff -u /tmp/62.jpI /tmp/spdk_tgt_config.json.Wnk 00:06:05.339 + echo 'INFO: JSON config files are the same' 00:06:05.339 INFO: JSON config files are the same 00:06:05.339 + rm /tmp/62.jpI /tmp/spdk_tgt_config.json.Wnk 00:06:05.339 + exit 0 00:06:05.339 13:28:52 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:05.339 13:28:52 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:05.339 INFO: changing configuration and checking if this can be detected... 00:06:05.339 13:28:52 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:05.339 13:28:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:05.597 13:28:52 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:05.597 13:28:52 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:05.597 13:28:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:05.597 + '[' 2 -ne 2 ']' 00:06:05.597 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:05.597 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
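The comparison traced here (json_diff.sh) normalizes both JSON configurations with config_filter.py -method sort and diffs the results; an empty diff means the relaunched target reproduced the saved configuration exactly. A rough equivalent, with illustrative temporary file names and assuming config_filter.py filters stdin to stdout as its invocation in this trace suggests:

    # normalize the live config and the saved config, then compare
    scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config | test/json_config/config_filter.py -method sort > live.json
    test/json_config/config_filter.py -method sort < spdk_tgt_config.json > saved.json
    diff -u saved.json live.json && echo 'INFO: JSON config files are the same'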
00:06:05.597 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:05.597 +++ basename /dev/fd/62 00:06:05.597 ++ mktemp /tmp/62.XXX 00:06:05.597 + tmp_file_1=/tmp/62.tjr 00:06:05.597 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:05.597 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:05.597 + tmp_file_2=/tmp/spdk_tgt_config.json.VdT 00:06:05.597 + ret=0 00:06:05.597 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:05.853 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:05.853 + diff -u /tmp/62.tjr /tmp/spdk_tgt_config.json.VdT 00:06:05.853 + ret=1 00:06:05.853 + echo '=== Start of file: /tmp/62.tjr ===' 00:06:05.853 + cat /tmp/62.tjr 00:06:05.853 + echo '=== End of file: /tmp/62.tjr ===' 00:06:05.853 + echo '' 00:06:05.853 + echo '=== Start of file: /tmp/spdk_tgt_config.json.VdT ===' 00:06:05.853 + cat /tmp/spdk_tgt_config.json.VdT 00:06:05.853 + echo '=== End of file: /tmp/spdk_tgt_config.json.VdT ===' 00:06:05.853 + echo '' 00:06:05.853 + rm /tmp/62.tjr /tmp/spdk_tgt_config.json.VdT 00:06:05.853 + exit 1 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:05.853 INFO: configuration change detected. 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:05.853 13:28:53 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:05.853 13:28:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@317 -- # [[ -n 4133336 ]] 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:05.853 13:28:53 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:05.853 13:28:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:05.853 13:28:53 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:05.853 13:28:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:06.109 13:28:53 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:06.109 13:28:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:06.109 13:28:53 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:06.110 13:28:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:06:06.366 13:28:53 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:06.366 13:28:53 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:06.687 13:28:54 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:06.687 13:28:54 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:06.687 13:28:54 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:06.687 13:28:54 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:06.687 13:28:54 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:06.687 13:28:54 json_config -- json_config/json_config.sh@323 -- # killprocess 4133336 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@948 -- # '[' -z 4133336 ']' 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@952 -- # kill -0 4133336 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@953 -- # uname 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4133336 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4133336' 00:06:06.687 killing process with pid 4133336 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@967 -- # kill 4133336 00:06:06.687 13:28:54 json_config -- common/autotest_common.sh@972 -- # wait 4133336 00:06:08.614 13:28:55 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:08.614 13:28:55 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:08.614 13:28:55 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:08.614 13:28:55 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:08.614 13:28:56 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:08.614 13:28:56 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:08.614 INFO: Success 00:06:08.614 00:06:08.614 real 0m18.063s 00:06:08.614 user 0m21.906s 00:06:08.614 sys 0m3.334s 00:06:08.614 13:28:56 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:08.614 13:28:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:08.615 ************************************ 00:06:08.615 END TEST json_config 00:06:08.615 ************************************ 00:06:08.615 13:28:56 -- common/autotest_common.sh@1142 -- # return 0 00:06:08.615 13:28:56 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:08.615 13:28:56 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:08.615 13:28:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.615 13:28:56 -- common/autotest_common.sh@10 -- # set +x 00:06:08.615 ************************************ 00:06:08.615 START TEST json_config_extra_key 00:06:08.615 ************************************ 00:06:08.615 13:28:56 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00d40ca9-2a78-e711-906e-0017a4403562 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=00d40ca9-2a78-e711-906e-0017a4403562 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:08.615 13:28:56 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:08.615 13:28:56 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:08.615 13:28:56 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:08.615 13:28:56 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:08.615 13:28:56 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:08.615 13:28:56 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:08.615 13:28:56 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:08.615 13:28:56 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:08.615 13:28:56 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:08.615 INFO: launching applications... 
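The launch traced next starts spdk_tgt from test/json_config/extra_key.json and waits for its RPC socket before continuing. Condensed, and taking waitforlisten to be the polling helper from autotest_common.sh, the sequence is roughly:

    # start the target preloaded from the extra_key config (paths abbreviated)
    build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json test/json_config/extra_key.json &
    app_pid=$!
    # block until the process is up and listening on the UNIX domain socket
    waitforlisten "$app_pid" /var/tmp/spdk_tgt.sock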
00:06:08.615 13:28:56 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=4134513 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:08.615 Waiting for target to run... 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 4134513 /var/tmp/spdk_tgt.sock 00:06:08.615 13:28:56 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 4134513 ']' 00:06:08.615 13:28:56 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:08.615 13:28:56 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:08.615 13:28:56 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:08.615 13:28:56 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:08.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:08.615 13:28:56 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:08.615 13:28:56 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:08.873 [2024-07-15 13:28:56.270046] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:08.874 [2024-07-15 13:28:56.270108] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4134513 ] 00:06:09.441 [2024-07-15 13:28:56.819981] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.441 [2024-07-15 13:28:56.916320] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.700 13:28:57 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.700 13:28:57 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:09.700 00:06:09.700 13:28:57 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:09.700 INFO: shutting down applications... 
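The shutdown that follows (json_config_test_shutdown_app) sends SIGINT to the target and then polls with kill -0 for up to 30 half-second intervals until the process disappears. A condensed sketch of that loop:

    kill -SIGINT "$app_pid"
    for ((i = 0; i < 30; i++)); do
        kill -0 "$app_pid" 2> /dev/null || break   # kill -0 fails once the process has exited
        sleep 0.5
    done
    echo 'SPDK target shutdown done'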
00:06:09.700 13:28:57 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 4134513 ]] 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 4134513 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 4134513 00:06:09.700 13:28:57 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 4134513 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:09.958 13:28:57 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:09.958 SPDK target shutdown done 00:06:09.958 13:28:57 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:09.958 Success 00:06:09.958 00:06:09.958 real 0m1.475s 00:06:09.958 user 0m0.807s 00:06:09.958 sys 0m0.683s 00:06:10.217 13:28:57 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:10.217 13:28:57 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:10.217 ************************************ 00:06:10.217 END TEST json_config_extra_key 00:06:10.217 ************************************ 00:06:10.217 13:28:57 -- common/autotest_common.sh@1142 -- # return 0 00:06:10.217 13:28:57 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:10.217 13:28:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:10.217 13:28:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.217 13:28:57 -- common/autotest_common.sh@10 -- # set +x 00:06:10.217 ************************************ 00:06:10.217 START TEST alias_rpc 00:06:10.217 ************************************ 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:10.217 * Looking for test storage... 
00:06:10.217 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:10.217 13:28:57 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:10.217 13:28:57 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=4134740 00:06:10.217 13:28:57 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.217 13:28:57 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 4134740 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 4134740 ']' 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:10.217 13:28:57 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:10.217 [2024-07-15 13:28:57.794337] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:10.217 [2024-07-15 13:28:57.794404] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4134740 ] 00:06:10.475 [2024-07-15 13:28:57.880891] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.475 [2024-07-15 13:28:57.962529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.041 13:28:58 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:11.041 13:28:58 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:06:11.041 13:28:58 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:11.299 13:28:58 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 4134740 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 4134740 ']' 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 4134740 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4134740 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4134740' 00:06:11.299 killing process with pid 4134740 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@967 -- # kill 4134740 00:06:11.299 13:28:58 alias_rpc -- common/autotest_common.sh@972 -- # wait 4134740 00:06:11.865 00:06:11.865 real 0m1.542s 00:06:11.865 user 0m1.589s 00:06:11.865 sys 0m0.494s 00:06:11.865 13:28:59 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:11.865 13:28:59 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:11.865 ************************************ 00:06:11.865 END TEST alias_rpc 
00:06:11.865 ************************************ 00:06:11.865 13:28:59 -- common/autotest_common.sh@1142 -- # return 0 00:06:11.865 13:28:59 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:11.865 13:28:59 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:11.865 13:28:59 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:11.865 13:28:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.865 13:28:59 -- common/autotest_common.sh@10 -- # set +x 00:06:11.865 ************************************ 00:06:11.865 START TEST spdkcli_tcp 00:06:11.865 ************************************ 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:11.865 * Looking for test storage... 00:06:11.865 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=4134983 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:11.865 13:28:59 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 4134983 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 4134983 ']' 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:11.865 13:28:59 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:11.865 [2024-07-15 13:28:59.434161] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
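The spdkcli_tcp test drives the target's RPC interface over TCP: spdk_tgt listens on the default UNIX socket, socat bridges that socket to 127.0.0.1:9998, and rpc.py connects to the TCP side with retries and a timeout. Pieced together from the commands traced in this run, the wiring looks roughly like:

    build/bin/spdk_tgt -m 0x3 -p 0 &                         # target on a two-core mask, main core 0
    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &  # expose the UNIX RPC socket on TCP port 9998
    scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods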
00:06:11.865 [2024-07-15 13:28:59.434221] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4134983 ] 00:06:12.123 [2024-07-15 13:28:59.520488] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:12.123 [2024-07-15 13:28:59.610180] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:12.123 [2024-07-15 13:28:59.610183] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.689 13:29:00 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:12.689 13:29:00 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:06:12.689 13:29:00 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=4135147 00:06:12.689 13:29:00 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:12.689 13:29:00 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:12.948 [ 00:06:12.948 "bdev_malloc_delete", 00:06:12.948 "bdev_malloc_create", 00:06:12.948 "bdev_null_resize", 00:06:12.948 "bdev_null_delete", 00:06:12.948 "bdev_null_create", 00:06:12.948 "bdev_nvme_cuse_unregister", 00:06:12.948 "bdev_nvme_cuse_register", 00:06:12.948 "bdev_opal_new_user", 00:06:12.948 "bdev_opal_set_lock_state", 00:06:12.948 "bdev_opal_delete", 00:06:12.948 "bdev_opal_get_info", 00:06:12.948 "bdev_opal_create", 00:06:12.948 "bdev_nvme_opal_revert", 00:06:12.948 "bdev_nvme_opal_init", 00:06:12.948 "bdev_nvme_send_cmd", 00:06:12.948 "bdev_nvme_get_path_iostat", 00:06:12.948 "bdev_nvme_get_mdns_discovery_info", 00:06:12.948 "bdev_nvme_stop_mdns_discovery", 00:06:12.948 "bdev_nvme_start_mdns_discovery", 00:06:12.948 "bdev_nvme_set_multipath_policy", 00:06:12.948 "bdev_nvme_set_preferred_path", 00:06:12.948 "bdev_nvme_get_io_paths", 00:06:12.948 "bdev_nvme_remove_error_injection", 00:06:12.948 "bdev_nvme_add_error_injection", 00:06:12.948 "bdev_nvme_get_discovery_info", 00:06:12.948 "bdev_nvme_stop_discovery", 00:06:12.948 "bdev_nvme_start_discovery", 00:06:12.948 "bdev_nvme_get_controller_health_info", 00:06:12.948 "bdev_nvme_disable_controller", 00:06:12.948 "bdev_nvme_enable_controller", 00:06:12.948 "bdev_nvme_reset_controller", 00:06:12.948 "bdev_nvme_get_transport_statistics", 00:06:12.948 "bdev_nvme_apply_firmware", 00:06:12.948 "bdev_nvme_detach_controller", 00:06:12.948 "bdev_nvme_get_controllers", 00:06:12.948 "bdev_nvme_attach_controller", 00:06:12.948 "bdev_nvme_set_hotplug", 00:06:12.948 "bdev_nvme_set_options", 00:06:12.948 "bdev_passthru_delete", 00:06:12.948 "bdev_passthru_create", 00:06:12.948 "bdev_lvol_set_parent_bdev", 00:06:12.948 "bdev_lvol_set_parent", 00:06:12.948 "bdev_lvol_check_shallow_copy", 00:06:12.948 "bdev_lvol_start_shallow_copy", 00:06:12.948 "bdev_lvol_grow_lvstore", 00:06:12.948 "bdev_lvol_get_lvols", 00:06:12.948 "bdev_lvol_get_lvstores", 00:06:12.948 "bdev_lvol_delete", 00:06:12.948 "bdev_lvol_set_read_only", 00:06:12.948 "bdev_lvol_resize", 00:06:12.948 "bdev_lvol_decouple_parent", 00:06:12.948 "bdev_lvol_inflate", 00:06:12.948 "bdev_lvol_rename", 00:06:12.948 "bdev_lvol_clone_bdev", 00:06:12.948 "bdev_lvol_clone", 00:06:12.948 "bdev_lvol_snapshot", 00:06:12.948 "bdev_lvol_create", 00:06:12.948 "bdev_lvol_delete_lvstore", 00:06:12.948 "bdev_lvol_rename_lvstore", 00:06:12.948 "bdev_lvol_create_lvstore", 
00:06:12.948 "bdev_raid_set_options", 00:06:12.948 "bdev_raid_remove_base_bdev", 00:06:12.948 "bdev_raid_add_base_bdev", 00:06:12.948 "bdev_raid_delete", 00:06:12.948 "bdev_raid_create", 00:06:12.948 "bdev_raid_get_bdevs", 00:06:12.948 "bdev_error_inject_error", 00:06:12.948 "bdev_error_delete", 00:06:12.948 "bdev_error_create", 00:06:12.948 "bdev_split_delete", 00:06:12.948 "bdev_split_create", 00:06:12.948 "bdev_delay_delete", 00:06:12.948 "bdev_delay_create", 00:06:12.948 "bdev_delay_update_latency", 00:06:12.948 "bdev_zone_block_delete", 00:06:12.948 "bdev_zone_block_create", 00:06:12.948 "blobfs_create", 00:06:12.948 "blobfs_detect", 00:06:12.948 "blobfs_set_cache_size", 00:06:12.948 "bdev_crypto_delete", 00:06:12.948 "bdev_crypto_create", 00:06:12.948 "bdev_compress_delete", 00:06:12.948 "bdev_compress_create", 00:06:12.948 "bdev_compress_get_orphans", 00:06:12.948 "bdev_aio_delete", 00:06:12.948 "bdev_aio_rescan", 00:06:12.948 "bdev_aio_create", 00:06:12.948 "bdev_ftl_set_property", 00:06:12.948 "bdev_ftl_get_properties", 00:06:12.948 "bdev_ftl_get_stats", 00:06:12.948 "bdev_ftl_unmap", 00:06:12.948 "bdev_ftl_unload", 00:06:12.948 "bdev_ftl_delete", 00:06:12.948 "bdev_ftl_load", 00:06:12.948 "bdev_ftl_create", 00:06:12.948 "bdev_virtio_attach_controller", 00:06:12.948 "bdev_virtio_scsi_get_devices", 00:06:12.948 "bdev_virtio_detach_controller", 00:06:12.948 "bdev_virtio_blk_set_hotplug", 00:06:12.948 "bdev_iscsi_delete", 00:06:12.948 "bdev_iscsi_create", 00:06:12.948 "bdev_iscsi_set_options", 00:06:12.948 "accel_error_inject_error", 00:06:12.948 "ioat_scan_accel_module", 00:06:12.948 "dsa_scan_accel_module", 00:06:12.948 "iaa_scan_accel_module", 00:06:12.948 "dpdk_cryptodev_get_driver", 00:06:12.948 "dpdk_cryptodev_set_driver", 00:06:12.948 "dpdk_cryptodev_scan_accel_module", 00:06:12.948 "compressdev_scan_accel_module", 00:06:12.948 "keyring_file_remove_key", 00:06:12.948 "keyring_file_add_key", 00:06:12.948 "keyring_linux_set_options", 00:06:12.948 "iscsi_get_histogram", 00:06:12.948 "iscsi_enable_histogram", 00:06:12.948 "iscsi_set_options", 00:06:12.948 "iscsi_get_auth_groups", 00:06:12.948 "iscsi_auth_group_remove_secret", 00:06:12.948 "iscsi_auth_group_add_secret", 00:06:12.948 "iscsi_delete_auth_group", 00:06:12.948 "iscsi_create_auth_group", 00:06:12.948 "iscsi_set_discovery_auth", 00:06:12.948 "iscsi_get_options", 00:06:12.948 "iscsi_target_node_request_logout", 00:06:12.948 "iscsi_target_node_set_redirect", 00:06:12.948 "iscsi_target_node_set_auth", 00:06:12.948 "iscsi_target_node_add_lun", 00:06:12.948 "iscsi_get_stats", 00:06:12.948 "iscsi_get_connections", 00:06:12.948 "iscsi_portal_group_set_auth", 00:06:12.948 "iscsi_start_portal_group", 00:06:12.948 "iscsi_delete_portal_group", 00:06:12.948 "iscsi_create_portal_group", 00:06:12.948 "iscsi_get_portal_groups", 00:06:12.948 "iscsi_delete_target_node", 00:06:12.948 "iscsi_target_node_remove_pg_ig_maps", 00:06:12.948 "iscsi_target_node_add_pg_ig_maps", 00:06:12.948 "iscsi_create_target_node", 00:06:12.948 "iscsi_get_target_nodes", 00:06:12.948 "iscsi_delete_initiator_group", 00:06:12.948 "iscsi_initiator_group_remove_initiators", 00:06:12.948 "iscsi_initiator_group_add_initiators", 00:06:12.948 "iscsi_create_initiator_group", 00:06:12.948 "iscsi_get_initiator_groups", 00:06:12.948 "nvmf_set_crdt", 00:06:12.948 "nvmf_set_config", 00:06:12.948 "nvmf_set_max_subsystems", 00:06:12.948 "nvmf_stop_mdns_prr", 00:06:12.948 "nvmf_publish_mdns_prr", 00:06:12.948 "nvmf_subsystem_get_listeners", 00:06:12.948 
"nvmf_subsystem_get_qpairs", 00:06:12.948 "nvmf_subsystem_get_controllers", 00:06:12.948 "nvmf_get_stats", 00:06:12.948 "nvmf_get_transports", 00:06:12.948 "nvmf_create_transport", 00:06:12.948 "nvmf_get_targets", 00:06:12.948 "nvmf_delete_target", 00:06:12.948 "nvmf_create_target", 00:06:12.948 "nvmf_subsystem_allow_any_host", 00:06:12.948 "nvmf_subsystem_remove_host", 00:06:12.948 "nvmf_subsystem_add_host", 00:06:12.948 "nvmf_ns_remove_host", 00:06:12.948 "nvmf_ns_add_host", 00:06:12.948 "nvmf_subsystem_remove_ns", 00:06:12.948 "nvmf_subsystem_add_ns", 00:06:12.948 "nvmf_subsystem_listener_set_ana_state", 00:06:12.948 "nvmf_discovery_get_referrals", 00:06:12.948 "nvmf_discovery_remove_referral", 00:06:12.948 "nvmf_discovery_add_referral", 00:06:12.948 "nvmf_subsystem_remove_listener", 00:06:12.948 "nvmf_subsystem_add_listener", 00:06:12.948 "nvmf_delete_subsystem", 00:06:12.948 "nvmf_create_subsystem", 00:06:12.948 "nvmf_get_subsystems", 00:06:12.948 "env_dpdk_get_mem_stats", 00:06:12.948 "nbd_get_disks", 00:06:12.948 "nbd_stop_disk", 00:06:12.948 "nbd_start_disk", 00:06:12.948 "ublk_recover_disk", 00:06:12.948 "ublk_get_disks", 00:06:12.948 "ublk_stop_disk", 00:06:12.948 "ublk_start_disk", 00:06:12.948 "ublk_destroy_target", 00:06:12.948 "ublk_create_target", 00:06:12.949 "virtio_blk_create_transport", 00:06:12.949 "virtio_blk_get_transports", 00:06:12.949 "vhost_controller_set_coalescing", 00:06:12.949 "vhost_get_controllers", 00:06:12.949 "vhost_delete_controller", 00:06:12.949 "vhost_create_blk_controller", 00:06:12.949 "vhost_scsi_controller_remove_target", 00:06:12.949 "vhost_scsi_controller_add_target", 00:06:12.949 "vhost_start_scsi_controller", 00:06:12.949 "vhost_create_scsi_controller", 00:06:12.949 "thread_set_cpumask", 00:06:12.949 "framework_get_governor", 00:06:12.949 "framework_get_scheduler", 00:06:12.949 "framework_set_scheduler", 00:06:12.949 "framework_get_reactors", 00:06:12.949 "thread_get_io_channels", 00:06:12.949 "thread_get_pollers", 00:06:12.949 "thread_get_stats", 00:06:12.949 "framework_monitor_context_switch", 00:06:12.949 "spdk_kill_instance", 00:06:12.949 "log_enable_timestamps", 00:06:12.949 "log_get_flags", 00:06:12.949 "log_clear_flag", 00:06:12.949 "log_set_flag", 00:06:12.949 "log_get_level", 00:06:12.949 "log_set_level", 00:06:12.949 "log_get_print_level", 00:06:12.949 "log_set_print_level", 00:06:12.949 "framework_enable_cpumask_locks", 00:06:12.949 "framework_disable_cpumask_locks", 00:06:12.949 "framework_wait_init", 00:06:12.949 "framework_start_init", 00:06:12.949 "scsi_get_devices", 00:06:12.949 "bdev_get_histogram", 00:06:12.949 "bdev_enable_histogram", 00:06:12.949 "bdev_set_qos_limit", 00:06:12.949 "bdev_set_qd_sampling_period", 00:06:12.949 "bdev_get_bdevs", 00:06:12.949 "bdev_reset_iostat", 00:06:12.949 "bdev_get_iostat", 00:06:12.949 "bdev_examine", 00:06:12.949 "bdev_wait_for_examine", 00:06:12.949 "bdev_set_options", 00:06:12.949 "notify_get_notifications", 00:06:12.949 "notify_get_types", 00:06:12.949 "accel_get_stats", 00:06:12.949 "accel_set_options", 00:06:12.949 "accel_set_driver", 00:06:12.949 "accel_crypto_key_destroy", 00:06:12.949 "accel_crypto_keys_get", 00:06:12.949 "accel_crypto_key_create", 00:06:12.949 "accel_assign_opc", 00:06:12.949 "accel_get_module_info", 00:06:12.949 "accel_get_opc_assignments", 00:06:12.949 "vmd_rescan", 00:06:12.949 "vmd_remove_device", 00:06:12.949 "vmd_enable", 00:06:12.949 "sock_get_default_impl", 00:06:12.949 "sock_set_default_impl", 00:06:12.949 "sock_impl_set_options", 00:06:12.949 
"sock_impl_get_options", 00:06:12.949 "iobuf_get_stats", 00:06:12.949 "iobuf_set_options", 00:06:12.949 "framework_get_pci_devices", 00:06:12.949 "framework_get_config", 00:06:12.949 "framework_get_subsystems", 00:06:12.949 "trace_get_info", 00:06:12.949 "trace_get_tpoint_group_mask", 00:06:12.949 "trace_disable_tpoint_group", 00:06:12.949 "trace_enable_tpoint_group", 00:06:12.949 "trace_clear_tpoint_mask", 00:06:12.949 "trace_set_tpoint_mask", 00:06:12.949 "keyring_get_keys", 00:06:12.949 "spdk_get_version", 00:06:12.949 "rpc_get_methods" 00:06:12.949 ] 00:06:12.949 13:29:00 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:12.949 13:29:00 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:12.949 13:29:00 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 4134983 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 4134983 ']' 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 4134983 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4134983 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4134983' 00:06:12.949 killing process with pid 4134983 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 4134983 00:06:12.949 13:29:00 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 4134983 00:06:13.516 00:06:13.516 real 0m1.561s 00:06:13.516 user 0m2.762s 00:06:13.516 sys 0m0.515s 00:06:13.516 13:29:00 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:13.516 13:29:00 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:13.516 ************************************ 00:06:13.516 END TEST spdkcli_tcp 00:06:13.516 ************************************ 00:06:13.516 13:29:00 -- common/autotest_common.sh@1142 -- # return 0 00:06:13.516 13:29:00 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:13.516 13:29:00 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:13.516 13:29:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.516 13:29:00 -- common/autotest_common.sh@10 -- # set +x 00:06:13.516 ************************************ 00:06:13.516 START TEST dpdk_mem_utility 00:06:13.516 ************************************ 00:06:13.516 13:29:00 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:13.516 * Looking for test storage... 
00:06:13.516 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:13.516 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:13.516 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=4135268 00:06:13.517 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 4135268 00:06:13.517 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 4135268 ']' 00:06:13.517 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.517 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:13.517 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:13.517 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:13.517 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:13.517 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:13.517 [2024-07-15 13:29:01.074191] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:13.517 [2024-07-15 13:29:01.074251] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4135268 ] 00:06:13.775 [2024-07-15 13:29:01.163490] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.775 [2024-07-15 13:29:01.249019] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.342 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:14.342 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:06:14.342 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:14.342 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:14.342 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:14.342 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:14.342 { 00:06:14.342 "filename": "/tmp/spdk_mem_dump.txt" 00:06:14.342 } 00:06:14.342 13:29:01 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:14.342 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:14.604 DPDK memory size 816.000000 MiB in 2 heap(s) 00:06:14.604 2 heaps totaling size 816.000000 MiB 00:06:14.604 size: 814.000000 MiB heap id: 0 00:06:14.604 size: 2.000000 MiB heap id: 1 00:06:14.604 end heaps---------- 00:06:14.604 8 mempools totaling size 598.116089 MiB 00:06:14.604 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:14.604 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:14.604 size: 84.521057 MiB name: bdev_io_4135268 00:06:14.604 size: 51.011292 MiB name: evtpool_4135268 00:06:14.604 size: 50.003479 MiB name: 
msgpool_4135268 00:06:14.604 size: 21.763794 MiB name: PDU_Pool 00:06:14.604 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:14.604 size: 0.026123 MiB name: Session_Pool 00:06:14.604 end mempools------- 00:06:14.604 201 memzones totaling size 4.176453 MiB 00:06:14.604 size: 1.000366 MiB name: RG_ring_0_4135268 00:06:14.604 size: 1.000366 MiB name: RG_ring_1_4135268 00:06:14.604 size: 1.000366 MiB name: RG_ring_4_4135268 00:06:14.604 size: 1.000366 MiB name: RG_ring_5_4135268 00:06:14.604 size: 0.125366 MiB name: RG_ring_2_4135268 00:06:14.604 size: 0.015991 MiB name: RG_ring_3_4135268 00:06:14.604 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:14.604 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.4_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:14.604 size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:14.605 size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:14.605 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:14.605 size: 0.000122 
MiB name: rte_cryptodev_data_2 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:14.605 size: 
0.000122 MiB name: rte_compressdev_data_20 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:14.605 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:14.605 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:14.605 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:14.605 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:14.605 end memzones------- 00:06:14.605 13:29:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:14.605 heap id: 0 total size: 814.000000 MiB number of busy elements: 537 number of free elements: 14 00:06:14.605 list of free elements. size: 11.811523 MiB 00:06:14.605 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:14.605 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:14.605 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:14.605 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:14.605 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:14.605 element at address: 0x200013800000 with size: 0.978882 MiB 00:06:14.605 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:14.605 element at address: 0x200019200000 with size: 0.937256 MiB 00:06:14.605 element at address: 0x20001aa00000 with size: 0.580505 MiB 00:06:14.605 element at address: 0x200003a00000 with size: 0.498535 MiB 00:06:14.605 element at address: 0x20000b200000 with size: 0.491272 MiB 00:06:14.605 element at address: 0x200000800000 with size: 0.486511 MiB 00:06:14.605 element at address: 0x200019400000 with size: 0.485840 MiB 00:06:14.605 element at address: 0x200027e00000 with size: 0.402710 MiB 00:06:14.605 list of standard malloc elements. 
size: 199.880188 MiB 00:06:14.606 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:14.606 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:14.606 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:14.606 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:14.606 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:14.606 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:14.606 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:14.606 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:14.606 element at address: 0x200000330b40 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000337640 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000033e140 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000344c40 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000034b740 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000352240 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000358d40 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000035f840 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:14.606 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:06:14.606 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:06:14.606 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:14.606 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000333040 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000335540 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000339b40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000033c040 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000340640 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000342b40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000347140 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000349640 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000350140 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000354740 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000356c40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000035a1c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000035b240 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000035d740 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:14.606 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:14.606 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:14.606 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:14.606 element at address: 0x200000204f80 with size: 0.000305 MiB 00:06:14.606 element at address: 0x200000200000 with size: 0.000183 MiB 00:06:14.606 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:06:14.606 element at address: 0x200000200180 with size: 0.000183 MiB 00:06:14.606 element at address: 0x200000200240 with size: 0.000183 MiB 00:06:14.606 element at address: 0x200000200300 with size: 0.000183 MiB 00:06:14.606 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200480 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200540 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200600 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200780 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200840 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200900 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200a80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200b40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200c00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200d80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200e40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200f00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201080 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201140 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201200 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201380 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201440 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201500 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201680 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201740 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201800 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201980 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201a40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201b00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201c80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201d40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201e00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000201f80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202040 with size: 0.000183 MiB 
00:06:14.607 element at address: 0x200000202100 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202280 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202340 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202400 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202580 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202640 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202700 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202880 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202940 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202a00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202b80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202c40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202d00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202e80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000202f40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203000 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203180 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203240 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203300 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203480 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203540 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203600 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203780 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203840 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203900 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203a80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203b40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203c00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203d80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203e40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203f00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204080 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204140 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204200 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204380 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204440 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204500 with size: 0.000183 MiB 00:06:14.607 element at 
address: 0x2000002045c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204680 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204740 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204800 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204980 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204a40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204b00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204c80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204d40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204e00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000204ec0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205180 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205240 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205300 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205480 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205540 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205600 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205780 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205840 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205900 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205a80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205b40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205c00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205d80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205e40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205f00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000206080 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000206140 with size: 0.000183 MiB 00:06:14.607 element at address: 0x200000206200 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000020a780 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022af80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b040 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b100 
with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b280 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b340 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b400 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b580 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b640 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b700 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b900 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022be40 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022c080 with size: 0.000183 MiB 00:06:14.607 element at address: 0x20000022c140 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000022c200 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000022c380 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000022c440 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000022c500 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000032e700 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000331d40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000338840 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000033f340 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000345e40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000034c940 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000034fec0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000353440 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000359f40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000360a40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000364180 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000367d00 with size: 0.000183 MiB 
00:06:14.608 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000376580 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000376740 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037a480 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000390280 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:14.608 element at 
address: 0x200000397640 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000397800 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c3900 
with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:14.608 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:06:14.608 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:14.609 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e67180 with size: 0.000183 MiB 
00:06:14.609 element at address: 0x200027e67240 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6de40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:14.609 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:14.609 list of memzone associated elements. 
size: 602.308289 MiB 00:06:14.609 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:14.609 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:14.609 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:14.609 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:14.609 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:14.609 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_4135268_0 00:06:14.609 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:14.609 associated memzone info: size: 48.002930 MiB name: MP_evtpool_4135268_0 00:06:14.609 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:14.609 associated memzone info: size: 48.002930 MiB name: MP_msgpool_4135268_0 00:06:14.609 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:14.609 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:14.609 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:14.609 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:14.609 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:14.609 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_4135268 00:06:14.609 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:14.609 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_4135268 00:06:14.609 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:06:14.609 associated memzone info: size: 1.007996 MiB name: MP_evtpool_4135268 00:06:14.609 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:14.609 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:14.609 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:14.609 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:14.609 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:14.609 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:14.609 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:14.609 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:14.609 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:14.609 associated memzone info: size: 1.000366 MiB name: RG_ring_0_4135268 00:06:14.609 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:14.609 associated memzone info: size: 1.000366 MiB name: RG_ring_1_4135268 00:06:14.609 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:14.609 associated memzone info: size: 1.000366 MiB name: RG_ring_4_4135268 00:06:14.609 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:14.609 associated memzone info: size: 1.000366 MiB name: RG_ring_5_4135268 00:06:14.609 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:14.609 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_4135268 00:06:14.609 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:06:14.609 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:14.609 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:14.609 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:14.609 element at address: 0x20001947c600 with size: 0.250488 MiB 00:06:14.609 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:14.609 element at address: 0x20000020a840 with size: 0.125488 MiB 00:06:14.609 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_4135268 00:06:14.609 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:14.609 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:14.609 element at address: 0x200027e67300 with size: 0.023743 MiB 00:06:14.609 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:14.609 element at address: 0x200000206580 with size: 0.016113 MiB 00:06:14.609 associated memzone info: size: 0.015991 MiB name: RG_ring_3_4135268 00:06:14.609 element at address: 0x200027e6d440 with size: 0.002441 MiB 00:06:14.609 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:14.609 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:14.609 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:14.609 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:14.609 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:06:14.609 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:06:14.610 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:06:14.610 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:06:14.610 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:06:14.610 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:06:14.610 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:06:14.610 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:06:14.610 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:06:14.610 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:06:14.610 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:06:14.610 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:06:14.610 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:06:14.610 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:06:14.610 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:06:14.610 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:06:14.610 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:06:14.610 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 
0000:3f:01.1_qat 00:06:14.610 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:06:14.610 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:06:14.610 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:06:14.610 element at address: 0x200000389140 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:06:14.610 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:06:14.610 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:06:14.610 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:14.610 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:14.610 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:14.610 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:14.610 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:14.610 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:14.610 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:14.610 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:14.610 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:14.610 element at address: 0x20000035d580 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:14.610 element at address: 0x20000035a000 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:14.610 element at address: 0x200000356a80 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:14.610 element at address: 0x200000353500 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat 00:06:14.610 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:14.610 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:14.610 element at address: 0x200000349480 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:14.610 element at address: 0x200000345f00 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:14.610 element at address: 
0x200000342980 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:14.610 element at address: 0x20000033f400 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:14.610 element at address: 0x20000033be80 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:14.610 element at address: 0x200000338900 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:14.610 element at address: 0x200000335380 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:14.610 element at address: 0x200000331e00 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:14.610 element at address: 0x20000032e880 with size: 0.000427 MiB 00:06:14.610 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:14.610 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:14.610 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:14.610 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:06:14.610 associated memzone info: size: 0.000183 MiB name: MP_msgpool_4135268 00:06:14.610 element at address: 0x200000206380 with size: 0.000305 MiB 00:06:14.610 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_4135268 00:06:14.610 element at address: 0x200027e6df00 with size: 0.000305 MiB 00:06:14.610 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:14.610 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:14.610 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:14.610 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:14.610 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:14.610 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:14.610 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:14.610 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:14.610 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:14.610 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:14.610 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:14.610 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:14.610 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:14.610 element at 
address: 0x2000003c7700 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:14.610 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:14.610 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:14.610 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:14.610 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:14.610 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:14.610 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:14.610 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:14.610 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:14.610 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:14.610 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:14.610 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:14.610 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:14.610 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:14.610 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:14.610 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:14.610 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:14.610 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:14.610 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:14.610 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:06:14.610 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:14.611 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:14.611 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:14.611 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_23 00:06:14.611 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:06:14.611 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:14.611 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:14.611 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:14.611 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:14.611 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:14.611 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:14.611 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:14.611 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:14.611 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:14.611 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:14.611 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:14.611 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:14.611 element at address: 0x20000039b600 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:14.611 element at address: 0x20000039b440 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:14.611 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:14.611 element at address: 0x200000397b40 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:14.611 element at address: 0x200000397980 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:14.611 element at address: 0x200000397700 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:14.611 element at address: 0x200000394080 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:14.611 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:14.611 element at address: 0x200000393c40 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:14.611 element at address: 
0x2000003905c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:14.611 element at address: 0x200000390400 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:14.611 element at address: 0x200000390180 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:14.611 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:14.611 element at address: 0x20000038c940 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:14.611 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:14.611 element at address: 0x200000389040 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:14.611 element at address: 0x200000388e80 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:14.611 element at address: 0x200000388c00 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:14.611 element at address: 0x200000385580 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:14.611 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:14.611 element at address: 0x200000385140 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:14.611 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:14.611 element at address: 0x200000381900 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:14.611 element at address: 0x200000381680 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:14.611 element at address: 0x20000037e000 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:14.611 element at address: 0x20000037de40 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:14.611 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:14.611 element at address: 0x20000037a540 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:14.611 element at address: 0x20000037a380 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:14.611 element at address: 0x20000037a100 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:14.611 element at address: 0x200000376a80 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:14.611 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_53 00:06:14.611 element at address: 0x200000376640 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:06:14.611 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:14.611 element at address: 0x200000372e00 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:14.611 element at address: 0x200000372b80 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:14.611 element at address: 0x20000036f500 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:14.611 element at address: 0x20000036f340 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:14.611 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:14.611 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:14.611 element at address: 0x20000036b880 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:14.611 element at address: 0x20000036b600 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:14.611 element at address: 0x200000367f80 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:14.611 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:14.611 element at address: 0x200000367b40 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:14.611 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:14.611 element at address: 0x200000364300 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:14.611 element at address: 0x200000364080 with size: 0.000244 MiB 00:06:14.611 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:14.611 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:06:14.611 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:14.611 13:29:02 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:14.611 13:29:02 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 4135268 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 4135268 ']' 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 4135268 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4135268 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:14.611 13:29:02 
dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4135268' 00:06:14.611 killing process with pid 4135268 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 4135268 00:06:14.611 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 4135268 00:06:14.871 00:06:14.871 real 0m1.525s 00:06:14.871 user 0m1.550s 00:06:14.871 sys 0m0.500s 00:06:14.871 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.871 13:29:02 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:14.871 ************************************ 00:06:14.871 END TEST dpdk_mem_utility 00:06:14.871 ************************************ 00:06:14.871 13:29:02 -- common/autotest_common.sh@1142 -- # return 0 00:06:14.871 13:29:02 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:14.871 13:29:02 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.871 13:29:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.871 13:29:02 -- common/autotest_common.sh@10 -- # set +x 00:06:15.130 ************************************ 00:06:15.130 START TEST event 00:06:15.130 ************************************ 00:06:15.130 13:29:02 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:15.130 * Looking for test storage... 00:06:15.130 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:15.130 13:29:02 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:15.130 13:29:02 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:15.130 13:29:02 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:15.130 13:29:02 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:15.130 13:29:02 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.130 13:29:02 event -- common/autotest_common.sh@10 -- # set +x 00:06:15.130 ************************************ 00:06:15.130 START TEST event_perf 00:06:15.130 ************************************ 00:06:15.130 13:29:02 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:15.130 Running I/O for 1 seconds...[2024-07-15 13:29:02.692413] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
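[Editor's note] The long per-element and memzone listing above is the memory report collected by the dpdk_mem_utility test before it kills pid 4135268. As a rough, hedged illustration of where that kind of data comes from (a standalone sketch built directly against DPDK, not the SPDK test script; the file name and dumping to stdout are assumptions):

    /* mem_dump_sketch.c - illustrative only: print DPDK heap-element statistics and
     * the memzone list, roughly the information shown in the report above. */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_malloc.h>
    #include <rte_memzone.h>

    int main(int argc, char **argv)
    {
        /* Bring up the EAL (hugepages, memory segments, etc.). */
        if (rte_eal_init(argc, argv) < 0) {
            fprintf(stderr, "rte_eal_init failed\n");
            return 1;
        }

        rte_malloc_dump_stats(stdout, NULL);   /* heap/element statistics, all types */
        rte_memzone_dump(stdout);              /* reserved memzones and their sizes */

        rte_eal_cleanup();
        return 0;
    }

The test itself appears to collect the equivalent report from the already-running SPDK target (hence the trap/killprocess lines above), so the sketch only shows the underlying DPDK dump helpers.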
00:06:15.130 [2024-07-15 13:29:02.692481] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4135624 ] 00:06:15.389 [2024-07-15 13:29:02.778448] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:15.389 [2024-07-15 13:29:02.866173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:15.389 [2024-07-15 13:29:02.866263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:15.389 [2024-07-15 13:29:02.866342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:15.389 [2024-07-15 13:29:02.866343] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.763 Running I/O for 1 seconds... 00:06:16.763 lcore 0: 215017 00:06:16.763 lcore 1: 215018 00:06:16.763 lcore 2: 215020 00:06:16.763 lcore 3: 215017 00:06:16.763 done. 00:06:16.763 00:06:16.763 real 0m1.276s 00:06:16.763 user 0m4.160s 00:06:16.763 sys 0m0.109s 00:06:16.763 13:29:03 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.763 13:29:03 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:16.763 ************************************ 00:06:16.763 END TEST event_perf 00:06:16.763 ************************************ 00:06:16.763 13:29:03 event -- common/autotest_common.sh@1142 -- # return 0 00:06:16.763 13:29:03 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:16.763 13:29:03 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:16.763 13:29:03 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.763 13:29:03 event -- common/autotest_common.sh@10 -- # set +x 00:06:16.763 ************************************ 00:06:16.763 START TEST event_reactor 00:06:16.763 ************************************ 00:06:16.763 13:29:04 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:16.763 [2024-07-15 13:29:04.064200] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
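[Editor's note] The event_perf run that just finished (per-lcore counts for lcores 0-3) stresses event dispatch across reactors. A minimal hedged sketch of that pattern against the public SPDK event API (illustrative only: the file name, stop condition, and single round of events are assumptions; the real test loops for the requested duration):

    /* event_sketch.c - illustrative only: start the app framework and send one
     * event to every reactor core, then stop. */
    #include "spdk/stdinc.h"
    #include "spdk/env.h"
    #include "spdk/event.h"

    static void
    hello_event(void *arg1, void *arg2)
    {
        (void)arg1; (void)arg2;
        printf("event ran on lcore %u\n", spdk_env_get_current_core());
        if (spdk_env_get_current_core() == spdk_env_get_last_core()) {
            spdk_app_stop(0);   /* crude stop condition, fine for a sketch */
        }
    }

    static void
    start(void *ctx)
    {
        uint32_t core;

        (void)ctx;
        SPDK_ENV_FOREACH_CORE(core) {
            /* allocate an event bound to 'core' and queue it to that reactor */
            spdk_event_call(spdk_event_allocate(core, hello_event, NULL, NULL));
        }
    }

    int
    main(int argc, char **argv)
    {
        struct spdk_app_opts opts = {};
        int rc;

        (void)argc; (void)argv;
        spdk_app_opts_init(&opts, sizeof(opts));
        opts.name = "event_sketch";
        opts.reactor_mask = "0xF";   /* four reactors, matching the -m 0xF run above */
        rc = spdk_app_start(&opts, start, NULL);
        spdk_app_fini();
        return rc;
    }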
00:06:16.763 [2024-07-15 13:29:04.064279] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4135820 ] 00:06:16.763 [2024-07-15 13:29:04.155714] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.763 [2024-07-15 13:29:04.241720] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.136 test_start 00:06:18.136 oneshot 00:06:18.136 tick 100 00:06:18.136 tick 100 00:06:18.136 tick 250 00:06:18.136 tick 100 00:06:18.136 tick 100 00:06:18.136 tick 250 00:06:18.136 tick 100 00:06:18.136 tick 500 00:06:18.136 tick 100 00:06:18.136 tick 100 00:06:18.136 tick 250 00:06:18.136 tick 100 00:06:18.136 tick 100 00:06:18.136 test_end 00:06:18.136 00:06:18.136 real 0m1.293s 00:06:18.136 user 0m1.172s 00:06:18.136 sys 0m0.116s 00:06:18.136 13:29:05 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:18.136 13:29:05 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:18.136 ************************************ 00:06:18.136 END TEST event_reactor 00:06:18.136 ************************************ 00:06:18.136 13:29:05 event -- common/autotest_common.sh@1142 -- # return 0 00:06:18.136 13:29:05 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:18.136 13:29:05 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:18.136 13:29:05 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:18.136 13:29:05 event -- common/autotest_common.sh@10 -- # set +x 00:06:18.136 ************************************ 00:06:18.136 START TEST event_reactor_perf 00:06:18.136 ************************************ 00:06:18.136 13:29:05 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:18.136 [2024-07-15 13:29:05.427628] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
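[Editor's note] The oneshot/tick lines in the reactor test output above are periodic callbacks firing on a single reactor. The framework's general mechanism for that kind of work is a timed poller; a hedged sketch follows (illustrative only, not the reactor test's exact implementation; the 100 ms period and ten-tick limit are arbitrary choices):

    /* poller_sketch.c - illustrative only: periodic work on an SPDK reactor via a
     * timed poller; stops itself after ten ticks. */
    #include "spdk/stdinc.h"
    #include "spdk/event.h"
    #include "spdk/thread.h"

    static struct spdk_poller *g_tick_poller;
    static int g_ticks;

    static int
    tick_cb(void *ctx)
    {
        (void)ctx;
        printf("tick %d\n", ++g_ticks);
        if (g_ticks == 10) {
            spdk_poller_unregister(&g_tick_poller);
            spdk_app_stop(0);
        }
        return SPDK_POLLER_BUSY;
    }

    static void
    start(void *ctx)
    {
        (void)ctx;
        /* period is in microseconds: fire roughly every 100 ms */
        g_tick_poller = spdk_poller_register(tick_cb, NULL, 100 * 1000);
    }

    int
    main(int argc, char **argv)
    {
        struct spdk_app_opts opts = {};
        int rc;

        (void)argc; (void)argv;
        spdk_app_opts_init(&opts, sizeof(opts));
        opts.name = "poller_sketch";
        rc = spdk_app_start(&opts, start, NULL);
        spdk_app_fini();
        return rc;
    }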
00:06:18.136 [2024-07-15 13:29:05.427690] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4136018 ] 00:06:18.136 [2024-07-15 13:29:05.515357] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.136 [2024-07-15 13:29:05.598128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.071 test_start 00:06:19.071 test_end 00:06:19.071 Performance: 517682 events per second 00:06:19.071 00:06:19.071 real 0m1.275s 00:06:19.071 user 0m1.164s 00:06:19.071 sys 0m0.106s 00:06:19.071 13:29:06 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.071 13:29:06 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:19.071 ************************************ 00:06:19.071 END TEST event_reactor_perf 00:06:19.071 ************************************ 00:06:19.330 13:29:06 event -- common/autotest_common.sh@1142 -- # return 0 00:06:19.330 13:29:06 event -- event/event.sh@49 -- # uname -s 00:06:19.330 13:29:06 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:19.330 13:29:06 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:19.330 13:29:06 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.330 13:29:06 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.330 13:29:06 event -- common/autotest_common.sh@10 -- # set +x 00:06:19.330 ************************************ 00:06:19.330 START TEST event_scheduler 00:06:19.330 ************************************ 00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:19.330 * Looking for test storage... 00:06:19.330 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:19.330 13:29:06 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:19.330 13:29:06 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=4136247 00:06:19.330 13:29:06 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:19.330 13:29:06 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:19.330 13:29:06 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 4136247 00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 4136247 ']' 00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.330 13:29:06 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:19.330 [2024-07-15 13:29:06.923457] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:19.330 [2024-07-15 13:29:06.923516] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4136247 ] 00:06:19.588 [2024-07-15 13:29:07.006189] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:19.588 [2024-07-15 13:29:07.090189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.588 [2024-07-15 13:29:07.090267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:19.589 [2024-07-15 13:29:07.090343] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:19.589 [2024-07-15 13:29:07.090344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:06:20.155 13:29:07 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:20.155 [2024-07-15 13:29:07.728675] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:20.155 [2024-07-15 13:29:07.728700] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:06:20.155 [2024-07-15 13:29:07.728711] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:20.155 [2024-07-15 13:29:07.728719] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:20.155 [2024-07-15 13:29:07.728727] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.155 13:29:07 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.155 13:29:07 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 [2024-07-15 13:29:07.813580] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
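[Editor's note] The scheduler_create_thread subtest that follows creates named lightweight threads pinned to individual cores through a test-plugin RPC (scheduler_thread_create), giving the dynamic scheduler something to move around. The public primitive underneath is spdk_thread_create() with a cpuset; a hedged fragment (illustrative only: the thread name and core number are assumptions, the plugin's "-a" activity knob is not modelled, and the fragment must be called from inside a running SPDK app, e.g. the start callback of the sketches above):

    /* Fragment, illustrative only: create a named SPDK thread pinned to core 1
     * and hand it one message. Assumes an spdk_app_start() context. */
    #include "spdk/stdinc.h"
    #include "spdk/cpuset.h"
    #include "spdk/thread.h"

    static void
    hello_from_thread(void *ctx)
    {
        (void)ctx;
        printf("running on thread '%s'\n",
               spdk_thread_get_name(spdk_get_thread()));
    }

    static void
    create_pinned_thread(void)
    {
        struct spdk_cpuset mask;
        struct spdk_thread *thread;

        spdk_cpuset_zero(&mask);
        spdk_cpuset_set_cpu(&mask, 1, true);   /* allow only core 1 */

        thread = spdk_thread_create("pinned_demo", &mask);
        if (thread != NULL) {
            spdk_thread_send_msg(thread, hello_from_thread, NULL);
        }
    }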
00:06:20.424 13:29:07 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:20.424 13:29:07 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:20.424 13:29:07 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 ************************************ 00:06:20.424 START TEST scheduler_create_thread 00:06:20.424 ************************************ 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 2 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 3 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 4 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 5 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 6 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 7 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 8 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 9 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 10 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.424 13:29:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:21.796 13:29:09 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.796 13:29:09 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:21.796 13:29:09 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:21.796 13:29:09 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.796 13:29:09 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:23.165 13:29:10 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:23.165 00:06:23.165 real 0m2.619s 00:06:23.165 user 0m0.024s 00:06:23.165 sys 0m0.007s 00:06:23.165 13:29:10 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:23.165 13:29:10 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:23.165 ************************************ 00:06:23.165 END TEST scheduler_create_thread 00:06:23.165 ************************************ 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:06:23.165 13:29:10 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:23.165 13:29:10 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 4136247 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 4136247 ']' 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 4136247 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4136247 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4136247' 00:06:23.165 killing process with pid 4136247 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 4136247 00:06:23.165 13:29:10 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 4136247 00:06:23.422 [2024-07-15 13:29:10.956146] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
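For readers following the trace, the scheduler_create_thread test above drives the scheduler test application purely over RPC. A rough standalone sketch of the same calls, assuming the scheduler app is already listening on the default RPC socket and that the test's scheduler_plugin module is importable (both are handled by the suite's rpc_cmd wrapper, which is not shown here):

rpc="./scripts/rpc.py --plugin scheduler_plugin"
$rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100   # busy thread pinned to core 0
$rpc scheduler_thread_create -n idle_pinned -m 0x1 -a 0       # idle thread pinned to the same core
tid=$($rpc scheduler_thread_create -n half_active -a 0)       # unpinned thread; the RPC prints the new thread id
$rpc scheduler_thread_set_active "$tid" 50                    # raise its busy percentage to 50
tid=$($rpc scheduler_thread_create -n deleted -a 100)
$rpc scheduler_thread_delete "$tid"                           # threads can also be torn down over RPC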
00:06:23.680 00:06:23.680 real 0m4.423s 00:06:23.680 user 0m8.116s 00:06:23.680 sys 0m0.469s 00:06:23.680 13:29:11 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:23.680 13:29:11 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:23.680 ************************************ 00:06:23.680 END TEST event_scheduler 00:06:23.680 ************************************ 00:06:23.680 13:29:11 event -- common/autotest_common.sh@1142 -- # return 0 00:06:23.680 13:29:11 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:23.680 13:29:11 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:23.680 13:29:11 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:23.680 13:29:11 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.680 13:29:11 event -- common/autotest_common.sh@10 -- # set +x 00:06:23.680 ************************************ 00:06:23.680 START TEST app_repeat 00:06:23.680 ************************************ 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@19 -- # repeat_pid=4136832 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 4136832' 00:06:23.680 Process app_repeat pid: 4136832 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:23.680 spdk_app_start Round 0 00:06:23.680 13:29:11 event.app_repeat -- event/event.sh@25 -- # waitforlisten 4136832 /var/tmp/spdk-nbd.sock 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4136832 ']' 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:23.680 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:23.680 13:29:11 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:23.938 [2024-07-15 13:29:11.325012] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:06:23.938 [2024-07-15 13:29:11.325079] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4136832 ] 00:06:23.938 [2024-07-15 13:29:11.411701] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:23.938 [2024-07-15 13:29:11.502106] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.938 [2024-07-15 13:29:11.502108] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.872 13:29:12 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.872 13:29:12 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:24.872 13:29:12 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:24.872 Malloc0 00:06:24.872 13:29:12 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:25.130 Malloc1 00:06:25.130 13:29:12 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:25.130 /dev/nbd0 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:25.130 1+0 records in 00:06:25.130 1+0 records out 00:06:25.130 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212744 s, 19.3 MB/s 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:25.130 13:29:12 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:25.130 13:29:12 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:25.387 /dev/nbd1 00:06:25.388 13:29:12 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:25.388 13:29:12 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:25.388 1+0 records in 00:06:25.388 1+0 records out 00:06:25.388 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276141 s, 14.8 MB/s 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:25.388 13:29:12 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:25.388 13:29:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:25.388 13:29:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:25.388 13:29:12 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:25.388 13:29:12 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.388 13:29:12 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:25.645 { 00:06:25.645 "nbd_device": "/dev/nbd0", 00:06:25.645 "bdev_name": "Malloc0" 00:06:25.645 }, 00:06:25.645 { 00:06:25.645 "nbd_device": "/dev/nbd1", 00:06:25.645 "bdev_name": "Malloc1" 00:06:25.645 } 00:06:25.645 ]' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:25.645 { 00:06:25.645 "nbd_device": "/dev/nbd0", 00:06:25.645 "bdev_name": "Malloc0" 00:06:25.645 }, 00:06:25.645 { 00:06:25.645 "nbd_device": "/dev/nbd1", 00:06:25.645 "bdev_name": "Malloc1" 00:06:25.645 } 00:06:25.645 ]' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:25.645 /dev/nbd1' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:25.645 /dev/nbd1' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:25.645 256+0 records in 00:06:25.645 256+0 records out 00:06:25.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113486 s, 92.4 MB/s 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:25.645 256+0 records in 00:06:25.645 256+0 records out 00:06:25.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202851 s, 51.7 MB/s 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:25.645 256+0 records in 00:06:25.645 256+0 records out 00:06:25.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216849 s, 48.4 MB/s 00:06:25.645 13:29:13 event.app_repeat 
-- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:25.645 13:29:13 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:25.902 13:29:13 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:26.159 13:29:13 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.159 13:29:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:26.416 13:29:13 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:26.416 13:29:13 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:26.673 13:29:14 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:26.673 [2024-07-15 13:29:14.286580] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:26.931 [2024-07-15 13:29:14.375754] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:26.931 [2024-07-15 13:29:14.375756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.931 [2024-07-15 13:29:14.423869] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:26.931 [2024-07-15 13:29:14.423913] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:30.209 13:29:17 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:30.209 13:29:17 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:30.209 spdk_app_start Round 1 00:06:30.209 13:29:17 event.app_repeat -- event/event.sh@25 -- # waitforlisten 4136832 /var/tmp/spdk-nbd.sock 00:06:30.209 13:29:17 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4136832 ']' 00:06:30.209 13:29:17 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:30.209 13:29:17 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.209 13:29:17 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:30.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
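Each app_repeat round that follows repeats the data path Round 0 just completed: create malloc bdevs, expose them over NBD, write a random pattern through the block device, and read it back for comparison. Condensed into a standalone sketch (socket path, bdev geometry, and I/O sizes are taken from the trace; the temporary file path here is illustrative, and the round does the same for Malloc1 on /dev/nbd1):

sock=/var/tmp/spdk-nbd.sock
rpc="./scripts/rpc.py -s $sock"
$rpc bdev_malloc_create 64 4096                                    # 64 MiB bdev with 4 KiB blocks -> Malloc0
$rpc nbd_start_disk Malloc0 /dev/nbd0                              # attach it as an NBD block device
dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256           # 1 MiB of random reference data
dd if=/tmp/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct # push it through the NBD device
cmp -b -n 1M /tmp/nbdrandtest /dev/nbd0                            # verify the device returns what was written
$rpc nbd_stop_disk /dev/nbd0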
00:06:30.210 13:29:17 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.210 13:29:17 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:30.210 13:29:17 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.210 13:29:17 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:30.210 13:29:17 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:30.210 Malloc0 00:06:30.210 13:29:17 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:30.210 Malloc1 00:06:30.210 13:29:17 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.210 13:29:17 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:30.210 /dev/nbd0 00:06:30.513 13:29:17 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:30.513 13:29:17 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:06:30.513 1+0 records in 00:06:30.513 1+0 records out 00:06:30.513 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026483 s, 15.5 MB/s 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:30.513 13:29:17 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:30.513 13:29:17 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.513 13:29:17 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.514 13:29:17 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:30.514 /dev/nbd1 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:30.514 1+0 records in 00:06:30.514 1+0 records out 00:06:30.514 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000181657 s, 22.5 MB/s 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:30.514 13:29:18 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.514 13:29:18 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:30.799 
{ 00:06:30.799 "nbd_device": "/dev/nbd0", 00:06:30.799 "bdev_name": "Malloc0" 00:06:30.799 }, 00:06:30.799 { 00:06:30.799 "nbd_device": "/dev/nbd1", 00:06:30.799 "bdev_name": "Malloc1" 00:06:30.799 } 00:06:30.799 ]' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:30.799 { 00:06:30.799 "nbd_device": "/dev/nbd0", 00:06:30.799 "bdev_name": "Malloc0" 00:06:30.799 }, 00:06:30.799 { 00:06:30.799 "nbd_device": "/dev/nbd1", 00:06:30.799 "bdev_name": "Malloc1" 00:06:30.799 } 00:06:30.799 ]' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:30.799 /dev/nbd1' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:30.799 /dev/nbd1' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:30.799 256+0 records in 00:06:30.799 256+0 records out 00:06:30.799 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010458 s, 100 MB/s 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:30.799 256+0 records in 00:06:30.799 256+0 records out 00:06:30.799 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203926 s, 51.4 MB/s 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:30.799 256+0 records in 00:06:30.799 256+0 records out 00:06:30.799 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214723 s, 48.8 MB/s 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:30.799 13:29:18 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:31.057 13:29:18 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.314 13:29:18 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:31.572 13:29:18 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:31.572 13:29:18 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:31.572 13:29:19 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:31.830 [2024-07-15 13:29:19.396368] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:32.087 [2024-07-15 13:29:19.481033] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:32.087 [2024-07-15 13:29:19.481039] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.087 [2024-07-15 13:29:19.530035] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:32.087 [2024-07-15 13:29:19.530074] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:34.612 13:29:22 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:34.612 13:29:22 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:34.612 spdk_app_start Round 2 00:06:34.612 13:29:22 event.app_repeat -- event/event.sh@25 -- # waitforlisten 4136832 /var/tmp/spdk-nbd.sock 00:06:34.612 13:29:22 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4136832 ']' 00:06:34.612 13:29:22 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:34.612 13:29:22 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.612 13:29:22 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:34.612 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
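The waitfornbd checks that precede every dd in these rounds poll the kernel until the NBD device is actually usable. Roughly, as reconstructed from the trace (the retry delay and temp-file location are assumptions; only the grep, dd, and stat checks appear in the log):

waitfornbd() {
  local nbd_name=$1 i size
  for ((i = 1; i <= 20; i++)); do                  # wait for the partition entry to appear
    grep -q -w "$nbd_name" /proc/partitions && break
    sleep 0.1
  done
  for ((i = 1; i <= 20; i++)); do                  # then wait until a direct read returns data
    dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct 2>/dev/null || true
    size=$(stat -c %s /tmp/nbdtest 2>/dev/null || echo 0)
    rm -f /tmp/nbdtest
    [ "$size" != 0 ] && return 0
  done
  return 1
}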
00:06:34.612 13:29:22 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.612 13:29:22 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:34.869 13:29:22 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:34.869 13:29:22 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:34.869 13:29:22 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:35.127 Malloc0 00:06:35.127 13:29:22 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:35.127 Malloc1 00:06:35.127 13:29:22 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:35.127 13:29:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.384 13:29:22 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:35.384 /dev/nbd0 00:06:35.384 13:29:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:35.384 13:29:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:06:35.384 1+0 records in 00:06:35.384 1+0 records out 00:06:35.384 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270072 s, 15.2 MB/s 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:35.384 13:29:22 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:35.384 13:29:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:35.384 13:29:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.384 13:29:22 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:35.642 /dev/nbd1 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:35.642 1+0 records in 00:06:35.642 1+0 records out 00:06:35.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267795 s, 15.3 MB/s 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:35.642 13:29:23 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.642 13:29:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:06:35.899 { 00:06:35.899 "nbd_device": "/dev/nbd0", 00:06:35.899 "bdev_name": "Malloc0" 00:06:35.899 }, 00:06:35.899 { 00:06:35.899 "nbd_device": "/dev/nbd1", 00:06:35.899 "bdev_name": "Malloc1" 00:06:35.899 } 00:06:35.899 ]' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:35.899 { 00:06:35.899 "nbd_device": "/dev/nbd0", 00:06:35.899 "bdev_name": "Malloc0" 00:06:35.899 }, 00:06:35.899 { 00:06:35.899 "nbd_device": "/dev/nbd1", 00:06:35.899 "bdev_name": "Malloc1" 00:06:35.899 } 00:06:35.899 ]' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:35.899 /dev/nbd1' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:35.899 /dev/nbd1' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:35.899 256+0 records in 00:06:35.899 256+0 records out 00:06:35.899 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108004 s, 97.1 MB/s 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:35.899 256+0 records in 00:06:35.899 256+0 records out 00:06:35.899 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200369 s, 52.3 MB/s 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:35.899 256+0 records in 00:06:35.899 256+0 records out 00:06:35.899 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217519 s, 48.2 MB/s 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:35.899 13:29:23 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:36.156 13:29:23 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.413 13:29:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:36.671 13:29:24 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:36.671 13:29:24 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:36.929 13:29:24 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:36.929 [2024-07-15 13:29:24.520393] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:37.187 [2024-07-15 13:29:24.601830] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:37.187 [2024-07-15 13:29:24.601834] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.187 [2024-07-15 13:29:24.646388] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:37.187 [2024-07-15 13:29:24.646441] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:39.712 13:29:27 event.app_repeat -- event/event.sh@38 -- # waitforlisten 4136832 /var/tmp/spdk-nbd.sock 00:06:39.712 13:29:27 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4136832 ']' 00:06:39.712 13:29:27 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:39.712 13:29:27 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:39.712 13:29:27 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:39.712 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
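After each round's nbd_stop_disk calls, the test confirms that no NBD devices remain attached by counting the entries returned by nbd_get_disks. A minimal sketch of that check, using the same jq/grep pipeline seen in the trace:

sock=/var/tmp/spdk-nbd.sock
json=$(./scripts/rpc.py -s $sock nbd_get_disks)                       # JSON array of {nbd_device, bdev_name}
count=$(echo "$json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
[ "$count" -eq 0 ] || echo "still $count NBD device(s) attached"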
00:06:39.712 13:29:27 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:39.712 13:29:27 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:39.968 13:29:27 event.app_repeat -- event/event.sh@39 -- # killprocess 4136832 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 4136832 ']' 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 4136832 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4136832 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4136832' 00:06:39.968 killing process with pid 4136832 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@967 -- # kill 4136832 00:06:39.968 13:29:27 event.app_repeat -- common/autotest_common.sh@972 -- # wait 4136832 00:06:40.225 spdk_app_start is called in Round 0. 00:06:40.225 Shutdown signal received, stop current app iteration 00:06:40.225 Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 reinitialization... 00:06:40.225 spdk_app_start is called in Round 1. 00:06:40.225 Shutdown signal received, stop current app iteration 00:06:40.225 Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 reinitialization... 00:06:40.225 spdk_app_start is called in Round 2. 00:06:40.225 Shutdown signal received, stop current app iteration 00:06:40.225 Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 reinitialization... 00:06:40.225 spdk_app_start is called in Round 3. 
00:06:40.225 Shutdown signal received, stop current app iteration 00:06:40.225 13:29:27 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:40.225 13:29:27 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:40.225 00:06:40.225 real 0m16.443s 00:06:40.225 user 0m34.883s 00:06:40.225 sys 0m3.196s 00:06:40.225 13:29:27 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:40.225 13:29:27 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:40.225 ************************************ 00:06:40.225 END TEST app_repeat 00:06:40.226 ************************************ 00:06:40.226 13:29:27 event -- common/autotest_common.sh@1142 -- # return 0 00:06:40.226 13:29:27 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:40.226 00:06:40.226 real 0m25.240s 00:06:40.226 user 0m49.669s 00:06:40.226 sys 0m4.394s 00:06:40.226 13:29:27 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:40.226 13:29:27 event -- common/autotest_common.sh@10 -- # set +x 00:06:40.226 ************************************ 00:06:40.226 END TEST event 00:06:40.226 ************************************ 00:06:40.226 13:29:27 -- common/autotest_common.sh@1142 -- # return 0 00:06:40.226 13:29:27 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:40.226 13:29:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:40.226 13:29:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.226 13:29:27 -- common/autotest_common.sh@10 -- # set +x 00:06:40.482 ************************************ 00:06:40.482 START TEST thread 00:06:40.482 ************************************ 00:06:40.482 13:29:27 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:40.482 * Looking for test storage... 00:06:40.482 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:06:40.482 13:29:27 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:40.482 13:29:27 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:40.482 13:29:27 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.482 13:29:27 thread -- common/autotest_common.sh@10 -- # set +x 00:06:40.482 ************************************ 00:06:40.482 START TEST thread_poller_perf 00:06:40.482 ************************************ 00:06:40.482 13:29:27 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:40.482 [2024-07-15 13:29:28.006918] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:40.482 [2024-07-15 13:29:28.006987] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4139245 ] 00:06:40.482 [2024-07-15 13:29:28.098053] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.740 [2024-07-15 13:29:28.188261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.740 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:41.672 ====================================== 00:06:41.672 busy:2306947494 (cyc) 00:06:41.672 total_run_count: 425000 00:06:41.672 tsc_hz: 2300000000 (cyc) 00:06:41.672 ====================================== 00:06:41.672 poller_cost: 5428 (cyc), 2360 (nsec) 00:06:41.672 00:06:41.672 real 0m1.298s 00:06:41.672 user 0m1.189s 00:06:41.672 sys 0m0.104s 00:06:41.672 13:29:29 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:41.672 13:29:29 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:41.672 ************************************ 00:06:41.672 END TEST thread_poller_perf 00:06:41.672 ************************************ 00:06:41.930 13:29:29 thread -- common/autotest_common.sh@1142 -- # return 0 00:06:41.930 13:29:29 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:41.930 13:29:29 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:41.930 13:29:29 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.930 13:29:29 thread -- common/autotest_common.sh@10 -- # set +x 00:06:41.930 ************************************ 00:06:41.930 START TEST thread_poller_perf 00:06:41.930 ************************************ 00:06:41.930 13:29:29 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:41.930 [2024-07-15 13:29:29.382974] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:41.930 [2024-07-15 13:29:29.383048] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4139454 ] 00:06:41.930 [2024-07-15 13:29:29.470723] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.187 [2024-07-15 13:29:29.558420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.187 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:06:43.118 ====================================== 00:06:43.118 busy:2301812586 (cyc) 00:06:43.118 total_run_count: 5486000 00:06:43.118 tsc_hz: 2300000000 (cyc) 00:06:43.118 ====================================== 00:06:43.118 poller_cost: 419 (cyc), 182 (nsec) 00:06:43.118 00:06:43.118 real 0m1.280s 00:06:43.118 user 0m1.168s 00:06:43.118 sys 0m0.108s 00:06:43.118 13:29:30 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:43.118 13:29:30 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:43.118 ************************************ 00:06:43.118 END TEST thread_poller_perf 00:06:43.118 ************************************ 00:06:43.118 13:29:30 thread -- common/autotest_common.sh@1142 -- # return 0 00:06:43.118 13:29:30 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:43.118 00:06:43.118 real 0m2.825s 00:06:43.118 user 0m2.440s 00:06:43.118 sys 0m0.395s 00:06:43.118 13:29:30 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:43.118 13:29:30 thread -- common/autotest_common.sh@10 -- # set +x 00:06:43.118 ************************************ 00:06:43.118 END TEST thread 00:06:43.118 ************************************ 00:06:43.118 13:29:30 -- common/autotest_common.sh@1142 -- # return 0 00:06:43.118 13:29:30 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:43.118 13:29:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:43.118 13:29:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.118 13:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:43.375 ************************************ 00:06:43.375 START TEST accel 00:06:43.375 ************************************ 00:06:43.375 13:29:30 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:43.375 * Looking for test storage... 00:06:43.375 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:43.375 13:29:30 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:43.375 13:29:30 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:06:43.375 13:29:30 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:43.375 13:29:30 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=4139780 00:06:43.375 13:29:30 accel -- accel/accel.sh@63 -- # waitforlisten 4139780 00:06:43.375 13:29:30 accel -- common/autotest_common.sh@829 -- # '[' -z 4139780 ']' 00:06:43.375 13:29:30 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:43.375 13:29:30 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:43.375 13:29:30 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:43.375 13:29:30 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:43.375 13:29:30 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:43.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
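Note: the two poller_perf summaries above reduce to the same arithmetic: poller_cost in cycles is the busy cycle count divided by total_run_count, and the nanosecond figure is that divided by the TSC rate in GHz. For the first run, 2306947494 / 425000 ≈ 5428 cycles and 5428 / 2.3 ≈ 2360 ns; for the second (zero-period) run, 2301812586 / 5486000 ≈ 419 cycles and ≈ 182 ns, matching the printed costs. A small sketch that reproduces the first run's numbers (values copied from this log; the formula is inferred from the printed fields):

busy=2306947494 runs=425000 tsc_hz=2300000000
awk -v b="$busy" -v r="$runs" -v hz="$tsc_hz" 'BEGIN {
  cyc  = b / r                 # cycles spent per poller invocation
  nsec = cyc * 1e9 / hz        # cycles converted to nanoseconds at the TSC rate
  printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, nsec
}'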
00:06:43.375 13:29:30 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:43.375 13:29:30 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:43.375 13:29:30 accel -- common/autotest_common.sh@10 -- # set +x 00:06:43.375 13:29:30 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:43.375 13:29:30 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.375 13:29:30 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.375 13:29:30 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:43.375 13:29:30 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:43.375 13:29:30 accel -- accel/accel.sh@41 -- # jq -r . 00:06:43.375 [2024-07-15 13:29:30.933193] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:43.375 [2024-07-15 13:29:30.933263] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4139780 ] 00:06:43.633 [2024-07-15 13:29:31.021413] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.633 [2024-07-15 13:29:31.104066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@862 -- # return 0 00:06:44.197 13:29:31 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:44.197 13:29:31 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:44.197 13:29:31 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:44.197 13:29:31 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:44.197 13:29:31 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:44.197 13:29:31 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:44.197 13:29:31 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@10 -- # set +x 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 
13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # IFS== 00:06:44.197 13:29:31 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:44.197 13:29:31 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:44.197 13:29:31 accel -- accel/accel.sh@75 -- # killprocess 4139780 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@948 -- # '[' -z 4139780 ']' 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@952 -- # kill -0 4139780 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@953 -- # uname 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:44.197 13:29:31 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4139780 00:06:44.454 13:29:31 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:44.454 13:29:31 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:44.454 13:29:31 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4139780' 00:06:44.454 killing process with pid 4139780 00:06:44.454 13:29:31 accel -- common/autotest_common.sh@967 -- # kill 4139780 00:06:44.454 13:29:31 accel -- common/autotest_common.sh@972 -- # wait 4139780 00:06:44.712 13:29:32 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:44.712 13:29:32 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@10 -- # set +x 00:06:44.712 13:29:32 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:06:44.712 13:29:32 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
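Note: the long run of IFS==/read/expected_opcs lines above is the loop that turns the accel_get_opc_assignments RPC output into an opcode-to-module map, with every opcode resolving to the software module in this run. A compressed, roughly equivalent form of that array-plus-loop (paths taken from this workspace; the while-read shape is a simplification of the accel.sh loop):

declare -A expected_opcs
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
while IFS== read -r opc module; do
  expected_opcs["$opc"]=$module        # e.g. copy=software, crc32c=software, ...
done < <("$rpc_py" accel_get_opc_assignments \
  | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]')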
00:06:44.712 13:29:32 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:44.712 13:29:32 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:44.712 13:29:32 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.712 13:29:32 accel -- common/autotest_common.sh@10 -- # set +x 00:06:44.969 ************************************ 00:06:44.969 START TEST accel_missing_filename 00:06:44.969 ************************************ 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:44.969 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:06:44.969 13:29:32 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:06:44.969 [2024-07-15 13:29:32.370737] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:44.969 [2024-07-15 13:29:32.370791] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4139998 ] 00:06:44.969 [2024-07-15 13:29:32.457752] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.969 [2024-07-15 13:29:32.547758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.226 [2024-07-15 13:29:32.612226] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:45.226 [2024-07-15 13:29:32.682337] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:06:45.226 A filename is required. 
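Note: accel_missing_filename is a negative test: accel_perf is launched with -w compress but no -l input file, the app aborts with "A filename is required.", and the wrapper only passes if the command fails. A minimal sketch of that inversion pattern (a simplified stand-in for the NOT helper in autotest_common.sh, which additionally maps the exit-status ranges seen as es=234 and es=106 below):

NOT() {
  if "$@"; then
    return 1      # the command unexpectedly succeeded -> negative test fails
  fi
  return 0        # the command failed as expected -> negative test passes
}
NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w compress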
00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:45.226 00:06:45.226 real 0m0.433s 00:06:45.226 user 0m0.300s 00:06:45.226 sys 0m0.153s 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:45.226 13:29:32 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:06:45.226 ************************************ 00:06:45.226 END TEST accel_missing_filename 00:06:45.226 ************************************ 00:06:45.226 13:29:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:45.226 13:29:32 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:45.226 13:29:32 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:06:45.226 13:29:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.226 13:29:32 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.484 ************************************ 00:06:45.484 START TEST accel_compress_verify 00:06:45.484 ************************************ 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:45.484 13:29:32 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.484 13:29:32 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:45.484 13:29:32 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:06:45.484 [2024-07-15 13:29:32.882409] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:45.484 [2024-07-15 13:29:32.882477] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4140026 ] 00:06:45.484 [2024-07-15 13:29:32.969222] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.484 [2024-07-15 13:29:33.054434] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.740 [2024-07-15 13:29:33.116720] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:45.740 [2024-07-15 13:29:33.186630] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:06:45.740 00:06:45.740 Compression does not support the verify option, aborting. 00:06:45.740 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:45.741 00:06:45.741 real 0m0.425s 00:06:45.741 user 0m0.289s 00:06:45.741 sys 0m0.162s 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:45.741 13:29:33 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:06:45.741 ************************************ 00:06:45.741 END TEST accel_compress_verify 00:06:45.741 ************************************ 00:06:45.741 13:29:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:45.741 13:29:33 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:45.741 13:29:33 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:45.741 13:29:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.741 13:29:33 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.741 ************************************ 00:06:45.741 START TEST accel_wrong_workload 00:06:45.741 ************************************ 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:06:45.741 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:06:45.741 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:06:45.997 13:29:33 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:06:45.997 Unsupported workload type: foobar 00:06:45.997 [2024-07-15 13:29:33.385049] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:45.997 accel_perf options: 00:06:45.997 [-h help message] 00:06:45.997 [-q queue depth per core] 00:06:45.997 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:45.997 [-T number of threads per core 00:06:45.997 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:45.997 [-t time in seconds] 00:06:45.997 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:45.997 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:45.997 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:45.997 [-l for compress/decompress workloads, name of uncompressed input file 00:06:45.997 [-S for crc32c workload, use this seed value (default 0) 00:06:45.997 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:45.997 [-f for fill workload, use this BYTE value (default 255) 00:06:45.997 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:45.997 [-y verify result if this switch is on] 00:06:45.997 [-a tasks to allocate per core (default: same value as -q)] 00:06:45.997 Can be used to spread operations across a wider range of memory. 
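Note: the help text above lists the knobs accel_perf accepts. A valid invocation built only from those options (and from the crc32c runs later in this log) looks like the following; the queue depth and transfer size are example values, everything else mirrors the listed flags:

/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
  -t 1 -w crc32c -S 32 -y -q 64 -o 4096   # 1 s of crc32c with seed 32, verify on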
00:06:45.997 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:06:45.997 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:45.997 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:45.997 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:45.997 00:06:45.997 real 0m0.039s 00:06:45.997 user 0m0.021s 00:06:45.997 sys 0m0.018s 00:06:45.997 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:45.997 13:29:33 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:45.997 ************************************ 00:06:45.997 END TEST accel_wrong_workload 00:06:45.997 ************************************ 00:06:45.997 Error: writing output failed: Broken pipe 00:06:45.997 13:29:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:45.997 13:29:33 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:45.997 13:29:33 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:06:45.997 13:29:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.997 13:29:33 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.997 ************************************ 00:06:45.997 START TEST accel_negative_buffers 00:06:45.997 ************************************ 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:06:45.998 13:29:33 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:06:45.998 -x option must be non-negative. 
00:06:45.998 [2024-07-15 13:29:33.502947] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:45.998 accel_perf options: 00:06:45.998 [-h help message] 00:06:45.998 [-q queue depth per core] 00:06:45.998 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:45.998 [-T number of threads per core 00:06:45.998 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:45.998 [-t time in seconds] 00:06:45.998 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:45.998 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:45.998 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:45.998 [-l for compress/decompress workloads, name of uncompressed input file 00:06:45.998 [-S for crc32c workload, use this seed value (default 0) 00:06:45.998 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:45.998 [-f for fill workload, use this BYTE value (default 255) 00:06:45.998 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:45.998 [-y verify result if this switch is on] 00:06:45.998 [-a tasks to allocate per core (default: same value as -q)] 00:06:45.998 Can be used to spread operations across a wider range of memory. 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:45.998 00:06:45.998 real 0m0.034s 00:06:45.998 user 0m0.021s 00:06:45.998 sys 0m0.013s 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:45.998 13:29:33 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:45.998 ************************************ 00:06:45.998 END TEST accel_negative_buffers 00:06:45.998 ************************************ 00:06:45.998 Error: writing output failed: Broken pipe 00:06:45.998 13:29:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:45.998 13:29:33 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:45.998 13:29:33 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:45.998 13:29:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.998 13:29:33 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.998 ************************************ 00:06:45.998 START TEST accel_crc32c 00:06:45.998 ************************************ 00:06:45.998 13:29:33 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:45.998 13:29:33 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:46.255 [2024-07-15 13:29:33.621252] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:46.256 [2024-07-15 13:29:33.621317] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4140249 ] 00:06:46.256 [2024-07-15 13:29:33.709078] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.256 [2024-07-15 13:29:33.795658] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.256 13:29:33 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:47.628 13:29:35 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:47.628 13:29:35 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.628 00:06:47.628 real 0m1.427s 00:06:47.628 user 0m1.264s 00:06:47.628 sys 0m0.158s 00:06:47.628 13:29:35 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:47.628 13:29:35 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:47.628 ************************************ 00:06:47.628 END TEST accel_crc32c 00:06:47.628 ************************************ 00:06:47.629 13:29:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:47.629 13:29:35 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:47.629 13:29:35 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:47.629 13:29:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.629 13:29:35 accel -- common/autotest_common.sh@10 -- # set +x 00:06:47.629 ************************************ 00:06:47.629 START TEST accel_crc32c_C2 00:06:47.629 ************************************ 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:47.629 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:47.629 [2024-07-15 13:29:35.113926] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:47.629 [2024-07-15 13:29:35.113974] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4140450 ] 00:06:47.629 [2024-07-15 13:29:35.200942] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.886 [2024-07-15 13:29:35.285696] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:47.886 13:29:35 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.886 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.887 13:29:35 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.258 00:06:49.258 real 0m1.400s 00:06:49.258 user 0m1.248s 00:06:49.258 sys 0m0.148s 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.258 13:29:36 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:49.258 ************************************ 00:06:49.258 END TEST accel_crc32c_C2 00:06:49.258 ************************************ 00:06:49.258 13:29:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:49.258 13:29:36 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:49.258 13:29:36 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:49.258 13:29:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.258 13:29:36 accel -- common/autotest_common.sh@10 -- # set +x 00:06:49.258 ************************************ 00:06:49.258 START TEST accel_copy 00:06:49.258 ************************************ 00:06:49.258 13:29:36 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.258 
13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:49.258 13:29:36 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:06:49.259 [2024-07-15 13:29:36.597168] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:49.259 [2024-07-15 13:29:36.597223] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4140647 ] 00:06:49.259 [2024-07-15 13:29:36.681656] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.259 [2024-07-15 13:29:36.766230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:49.259 13:29:36 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:50.631 13:29:37 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:50.631 13:29:37 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.631 00:06:50.631 real 0m1.416s 00:06:50.631 user 0m0.006s 00:06:50.631 sys 0m0.002s 00:06:50.631 13:29:37 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.631 13:29:37 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:06:50.631 ************************************ 00:06:50.631 END TEST accel_copy 00:06:50.631 ************************************ 00:06:50.631 13:29:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:50.631 13:29:38 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.631 13:29:38 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:06:50.631 13:29:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.631 13:29:38 accel -- common/autotest_common.sh@10 -- # set +x 00:06:50.631 ************************************ 00:06:50.631 START TEST accel_fill 00:06:50.631 ************************************ 00:06:50.631 13:29:38 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 
]] 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:06:50.631 13:29:38 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:06:50.631 [2024-07-15 13:29:38.077653] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:50.631 [2024-07-15 13:29:38.077710] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4140844 ] 00:06:50.631 [2024-07-15 13:29:38.163569] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.631 [2024-07-15 13:29:38.248721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 
13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:50.890 13:29:38 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@21 
-- # case "$var" in 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:52.266 13:29:39 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.266 00:06:52.266 real 0m1.424s 00:06:52.266 user 0m1.258s 00:06:52.266 sys 0m0.160s 00:06:52.266 13:29:39 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:52.266 13:29:39 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:06:52.266 ************************************ 00:06:52.266 END TEST accel_fill 00:06:52.266 ************************************ 00:06:52.266 13:29:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:52.266 13:29:39 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:52.266 13:29:39 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:52.266 13:29:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:52.266 13:29:39 accel -- common/autotest_common.sh@10 -- # set +x 00:06:52.266 ************************************ 00:06:52.266 START TEST accel_copy_crc32c 00:06:52.266 ************************************ 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:52.266 [2024-07-15 13:29:39.555309] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:52.266 [2024-07-15 13:29:39.555355] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4141041 ] 00:06:52.266 [2024-07-15 13:29:39.640562] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.266 [2024-07-15 13:29:39.724950] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:52.266 13:29:39 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 
00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.641 00:06:53.641 real 0m1.417s 00:06:53.641 user 0m1.262s 00:06:53.641 sys 0m0.152s 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:53.641 13:29:40 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:53.641 ************************************ 00:06:53.641 END TEST accel_copy_crc32c 00:06:53.641 ************************************ 00:06:53.641 13:29:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:53.641 13:29:40 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:53.641 13:29:40 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:53.641 13:29:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.641 13:29:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:53.641 ************************************ 00:06:53.641 START TEST accel_copy_crc32c_C2 00:06:53.641 ************************************ 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:53.641 13:29:41 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:53.641 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:53.641 [2024-07-15 13:29:41.058941] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:06:53.641 [2024-07-15 13:29:41.059006] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4141234 ] 00:06:53.641 [2024-07-15 13:29:41.148408] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.641 [2024-07-15 13:29:41.243445] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.899 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.900 13:29:41 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.845 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.163 00:06:55.163 real 0m1.433s 00:06:55.163 user 0m1.267s 
00:06:55.163 sys 0m0.165s 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:55.163 13:29:42 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:55.163 ************************************ 00:06:55.163 END TEST accel_copy_crc32c_C2 00:06:55.163 ************************************ 00:06:55.163 13:29:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:55.163 13:29:42 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:55.163 13:29:42 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:55.163 13:29:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.163 13:29:42 accel -- common/autotest_common.sh@10 -- # set +x 00:06:55.163 ************************************ 00:06:55.163 START TEST accel_dualcast 00:06:55.163 ************************************ 00:06:55.163 13:29:42 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:06:55.163 13:29:42 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:06:55.163 [2024-07-15 13:29:42.567652] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
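The trace above repeatedly records accel_perf being launched as /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w <workload> -y for each operation (copy, fill, copy_crc32c, copy_crc32c_C2, dualcast, compare). As a reading aid only, here is a minimal sketch of replaying one such invocation by hand; the ACCEL_PERF path, the empty JSON config, and the use of process substitution to produce a /dev/fd path are assumptions for illustration and are not taken from accel.sh itself.

```bash
#!/usr/bin/env bash
# Minimal sketch of re-running one invocation recorded in this log by hand.
# ACCEL_PERF is a placeholder; the binary in the trace lives under
# /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf.
set -euo pipefail

ACCEL_PERF=${ACCEL_PERF:-./spdk/build/examples/accel_perf}

# Same flags as the accel_copy run above: -t 1 (shown as "1 seconds" in the
# trace), -w copy (the workload under test), -y (copied verbatim from the
# log), and a JSON config handed over as a /dev/fd path via process
# substitution, analogous to the "-c /dev/fd/62" argument in the trace.
"$ACCEL_PERF" -c <(echo '{}') -t 1 -w copy -y
```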
00:06:55.163 [2024-07-15 13:29:42.567711] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4141439 ] 00:06:55.163 [2024-07-15 13:29:42.651878] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.163 [2024-07-15 13:29:42.736121] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:55.422 13:29:42 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:56.356 13:29:43 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.356 00:06:56.356 real 0m1.413s 00:06:56.356 user 0m1.246s 00:06:56.356 sys 0m0.159s 00:06:56.356 13:29:43 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.356 13:29:43 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:06:56.356 ************************************ 00:06:56.356 END TEST accel_dualcast 00:06:56.356 ************************************ 00:06:56.614 13:29:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:56.614 13:29:43 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:56.614 13:29:43 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:56.614 13:29:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.614 13:29:43 accel -- common/autotest_common.sh@10 -- # set +x 00:06:56.614 ************************************ 00:06:56.614 START TEST accel_compare 00:06:56.614 ************************************ 00:06:56.614 13:29:44 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:06:56.614 13:29:44 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:06:56.614 13:29:44 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:06:56.614 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.614 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.614 13:29:44 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:06:56.615 13:29:44 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:06:56.615 [2024-07-15 13:29:44.038728] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
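Most of the surrounding lines are xtrace output from a loop in accel.sh that splits its input on ':' (IFS=:), reads each line with read -r var val, and matches "$var" in a case statement to record accel_module and accel_opc before the final [[ -n software ]] / [[ -n <opcode> ]] checks. The following is a hypothetical reconstruction of that loop shape only; the function name, the key strings it matches, and the sample input are invented for illustration and may not match the real script.

```bash
#!/usr/bin/env bash
# Hypothetical reconstruction of the loop shape behind the repeated
# "IFS=: / read -r var val / case \"$var\" in" xtrace lines above; the real
# accel.sh may differ. It scans "key: value" lines and keeps two of them,
# mirroring the accel_module=software / accel_opc=<workload> assignments
# that appear in the trace.
parse_accel_output() {
    local accel_module="" accel_opc="" var val
    while IFS=: read -r var val; do
        case "$var" in
            *Module*)   accel_module=${val##* } ;;  # key names are assumed
            *Workload*) accel_opc=${val##* } ;;
        esac
    done
    # Same shape as the [[ -n software ]] / [[ -n compare ]] checks in the log.
    [[ -n $accel_module ]] && [[ -n $accel_opc ]] && \
        echo "ran $accel_opc via the $accel_module module"
}

# Example input; the actual accel_perf output format is not shown in this log.
printf 'Module: software\nWorkload Type: compare\n' | parse_accel_output
```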
00:06:56.615 [2024-07-15 13:29:44.038773] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4141635 ] 00:06:56.615 [2024-07-15 13:29:44.121158] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.615 [2024-07-15 13:29:44.208100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:56.873 13:29:44 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:56.874 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:56.874 13:29:44 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@20 -- # val= 
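Most of the lines above are the harness stepping through var:val pairs with the repeated IFS=: / read -r var val / case "$var" in pattern, which is how values such as accel_opc=compare and accel_module=software get picked out of the run. The sketch below shows that idiom in isolation; it is an assumed simplification for illustration only — the input lines and field names are hypothetical stand-ins, not the real accel.sh source or its actual input.

    # Hedged sketch of the var:val parsing idiom seen in the xtrace output above.
    while IFS=: read -r var val; do
        case "$var" in
            opc)    accel_opc=${val# } ;;     # e.g. "compare"
            module) accel_module=${val# } ;;  # e.g. "software"
            *)      ;;                        # other fields ignored in this sketch
        esac
    done < <(printf '%s\n' 'opc: compare' 'module: software')
    echo "opcode=$accel_opc module=$accel_module"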
00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:06:58.251 13:29:45 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.251 00:06:58.251 real 0m1.413s 00:06:58.251 user 0m1.252s 00:06:58.251 sys 0m0.152s 00:06:58.251 13:29:45 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.251 13:29:45 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:06:58.251 ************************************ 00:06:58.251 END TEST accel_compare 00:06:58.251 ************************************ 00:06:58.251 13:29:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:58.251 13:29:45 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:58.251 13:29:45 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:58.251 13:29:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.251 13:29:45 accel -- common/autotest_common.sh@10 -- # set +x 00:06:58.251 ************************************ 00:06:58.251 START TEST accel_xor 00:06:58.251 ************************************ 00:06:58.251 13:29:45 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:58.251 [2024-07-15 13:29:45.539544] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
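The xor pass that starts at this point reuses the same driver with -w xor and, as the val=2 entry further down shows, runs with two source buffers. A hedged standalone equivalent, under the same path assumption as the compare sketch above:

    # Hedged sketch: 1-second xor workload with result verification, default source count.
    ./build/examples/accel_perf -t 1 -w xor -y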
00:06:58.251 [2024-07-15 13:29:45.539613] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4141872 ] 00:06:58.251 [2024-07-15 13:29:45.628947] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.251 [2024-07-15 13:29:45.715212] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:58.251 13:29:45 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:58.251 13:29:45 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:59.628 13:29:46 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.628 00:06:59.628 real 0m1.434s 00:06:59.628 user 0m1.266s 00:06:59.628 sys 0m0.160s 00:06:59.628 13:29:46 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:59.628 13:29:46 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:59.628 ************************************ 00:06:59.628 END TEST accel_xor 00:06:59.628 ************************************ 00:06:59.628 13:29:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:59.628 13:29:46 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:59.628 13:29:46 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:59.628 13:29:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.628 13:29:46 accel -- common/autotest_common.sh@10 -- # set +x 00:06:59.628 ************************************ 00:06:59.628 START TEST accel_xor 00:06:59.628 ************************************ 00:06:59.628 13:29:47 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:59.628 13:29:47 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:59.628 [2024-07-15 13:29:47.045413] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
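This second accel_xor test adds -x 3 to the same command line, and the trace below shows the source-buffer count switching from 2 to 3 (val=3). A hedged equivalent:

    # Hedged sketch: xor across three source buffers instead of the default two.
    ./build/examples/accel_perf -t 1 -w xor -y -x 3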
00:06:59.628 [2024-07-15 13:29:47.045475] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4142134 ] 00:06:59.628 [2024-07-15 13:29:47.131761] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.628 [2024-07-15 13:29:47.217103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:59.887 13:29:47 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:59.887 13:29:47 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:00.823 13:29:48 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.823 00:07:00.823 real 0m1.426s 00:07:00.823 user 0m1.259s 00:07:00.823 sys 0m0.164s 00:07:00.823 13:29:48 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:00.823 13:29:48 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:00.823 ************************************ 00:07:00.823 END TEST accel_xor 00:07:00.823 ************************************ 00:07:01.082 13:29:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:01.082 13:29:48 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:01.082 13:29:48 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:01.082 13:29:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.082 13:29:48 accel -- common/autotest_common.sh@10 -- # set +x 00:07:01.082 ************************************ 00:07:01.082 START TEST accel_dif_verify 00:07:01.082 ************************************ 00:07:01.082 13:29:48 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:01.082 13:29:48 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:01.082 [2024-07-15 13:29:48.552232] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
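The dif_verify pass carries extra sizes through the same var/val loop (4096-byte data buffers plus 512-byte and 8-byte values below); the trace does not spell out which DIF parameters those map to, so the sketch only mirrors the flags actually shown and leaves the sizing at the tool's defaults.

    # Hedged sketch: DIF verify workload; block/metadata sizing left at accel_perf defaults.
    ./build/examples/accel_perf -t 1 -w dif_verify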
00:07:01.082 [2024-07-15 13:29:48.552290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4142387 ] 00:07:01.082 [2024-07-15 13:29:48.639636] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.342 [2024-07-15 13:29:48.734822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:01.342 13:29:48 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:02.721 13:29:49 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:02.721 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:02.722 13:29:49 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.722 00:07:02.722 real 0m1.419s 00:07:02.722 user 0m1.264s 00:07:02.722 sys 0m0.154s 00:07:02.722 13:29:49 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:02.722 13:29:49 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:02.722 ************************************ 00:07:02.722 END TEST accel_dif_verify 00:07:02.722 ************************************ 00:07:02.722 13:29:49 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:02.722 13:29:49 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:02.722 13:29:49 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:02.722 13:29:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.722 13:29:49 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.722 ************************************ 00:07:02.722 START TEST accel_dif_generate 00:07:02.722 ************************************ 00:07:02.722 13:29:50 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:07:02.722 [2024-07-15 13:29:50.031775] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:02.722 [2024-07-15 13:29:50.031825] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4142584 ] 00:07:02.722 [2024-07-15 13:29:50.115042] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.722 [2024-07-15 13:29:50.200741] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@23 -- 
# accel_opc=dif_generate 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:02.722 13:29:50 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:04.100 13:29:51 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.100 00:07:04.100 real 0m1.404s 00:07:04.100 user 0m1.257s 00:07:04.100 sys 0m0.146s 00:07:04.100 
13:29:51 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:04.100 13:29:51 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:04.100 ************************************ 00:07:04.100 END TEST accel_dif_generate 00:07:04.100 ************************************ 00:07:04.100 13:29:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:04.100 13:29:51 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:04.100 13:29:51 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:04.100 13:29:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.100 13:29:51 accel -- common/autotest_common.sh@10 -- # set +x 00:07:04.100 ************************************ 00:07:04.100 START TEST accel_dif_generate_copy 00:07:04.100 ************************************ 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:04.100 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:07:04.100 [2024-07-15 13:29:51.506760] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
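With dif_generate finished in roughly the same ~1.4 s wall time as the earlier passes, the harness moves on to dif_generate_copy using the same invocation pattern. A hedged equivalent, under the same assumptions as the sketches above:

    # Hedged sketch: DIF generate-and-copy workload.
    ./build/examples/accel_perf -t 1 -w dif_generate_copy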
00:07:04.100 [2024-07-15 13:29:51.506803] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4142781 ] 00:07:04.100 [2024-07-15 13:29:51.591171] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.100 [2024-07-15 13:29:51.675392] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:04.359 13:29:51 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.293 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:05.293 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:05.293 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:05.293 13:29:52 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.293 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:05.293 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:05.293 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.294 00:07:05.294 real 0m1.403s 00:07:05.294 user 0m1.252s 00:07:05.294 sys 0m0.146s 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:05.294 13:29:52 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:07:05.294 ************************************ 00:07:05.294 END TEST accel_dif_generate_copy 00:07:05.294 ************************************ 00:07:05.553 13:29:52 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:05.553 13:29:52 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:05.553 13:29:52 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.553 13:29:52 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:05.553 13:29:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.553 13:29:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:05.553 ************************************ 00:07:05.553 START TEST accel_comp 00:07:05.553 ************************************ 00:07:05.553 13:29:52 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:05.553 
13:29:52 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:05.553 13:29:52 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:07:05.553 [2024-07-15 13:29:53.003277] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:05.553 [2024-07-15 13:29:53.003328] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4142976 ] 00:07:05.553 [2024-07-15 13:29:53.090435] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.811 [2024-07-15 13:29:53.177613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 
13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.811 13:29:53 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:07.188 13:29:54 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.188 00:07:07.188 real 0m1.423s 00:07:07.188 user 0m1.269s 00:07:07.188 sys 0m0.153s 00:07:07.188 13:29:54 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:07.188 13:29:54 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:07:07.188 ************************************ 00:07:07.188 END TEST accel_comp 00:07:07.188 ************************************ 00:07:07.188 13:29:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:07.188 13:29:54 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:07.188 13:29:54 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:07.188 13:29:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.188 13:29:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:07.188 ************************************ 00:07:07.188 START TEST accel_decomp 
00:07:07.188 ************************************ 00:07:07.188 13:29:54 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:07.188 13:29:54 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:07.188 13:29:54 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:07.189 [2024-07-15 13:29:54.512621] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:07.189 [2024-07-15 13:29:54.512685] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4143168 ] 00:07:07.189 [2024-07-15 13:29:54.602122] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.189 [2024-07-15 13:29:54.685090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 
13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.189 13:29:54 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:08.567 13:29:55 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.567 00:07:08.567 real 0m1.415s 00:07:08.567 user 0m1.256s 00:07:08.567 sys 0m0.155s 00:07:08.567 13:29:55 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.567 13:29:55 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:08.567 ************************************ 00:07:08.567 END TEST accel_decomp 00:07:08.567 ************************************ 
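For reference, the compress and decompress cases above drive the standalone accel_perf example binary directly; the full command lines are recorded in the trace. A roughly equivalent manual invocation on the same workspace layout would look like the sketch below. This is an illustration inferred from the log, not harness output: the -c /dev/fd/62 argument (the JSON accel config the test script feeds in over a file descriptor) is omitted here, -t 1 runs the workload for one second, -w selects the opcode, -l points at the bib input file, and -y enables verification on the decompress pass.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
      -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
      -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
Without the JSON config the software accel module is used, which matches the "accel_module=software" values seen in the trace above.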
00:07:08.567 13:29:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:08.567 13:29:55 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:08.567 13:29:55 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:08.567 13:29:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.567 13:29:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:08.567 ************************************ 00:07:08.567 START TEST accel_decomp_full 00:07:08.567 ************************************ 00:07:08.567 13:29:55 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:08.567 13:29:55 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:08.567 [2024-07-15 13:29:55.985609] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:08.567 [2024-07-15 13:29:55.985669] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4143370 ] 00:07:08.567 [2024-07-15 13:29:56.069285] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.567 [2024-07-15 13:29:56.153263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.827 13:29:56 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:09.766 13:29:57 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.766 00:07:09.766 real 0m1.420s 00:07:09.766 user 0m1.262s 00:07:09.766 sys 0m0.156s 00:07:09.766 13:29:57 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.766 13:29:57 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:09.766 ************************************ 00:07:09.766 END TEST accel_decomp_full 00:07:09.766 ************************************ 00:07:10.026 13:29:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:10.026 13:29:57 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:10.026 13:29:57 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:10.026 13:29:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.026 13:29:57 accel -- common/autotest_common.sh@10 -- # set +x 00:07:10.026 ************************************ 00:07:10.026 START TEST accel_decomp_mcore 00:07:10.026 ************************************ 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:10.026 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:10.026 [2024-07-15 13:29:57.488294] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:10.026 [2024-07-15 13:29:57.488359] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4143566 ] 00:07:10.026 [2024-07-15 13:29:57.575864] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:10.285 [2024-07-15 13:29:57.671087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.285 [2024-07-15 13:29:57.671175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:10.285 [2024-07-15 13:29:57.671251] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:10.285 [2024-07-15 13:29:57.671253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.285 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.286 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.286 13:29:57 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:11.660 00:07:11.660 real 0m1.458s 00:07:11.660 user 0m4.717s 00:07:11.660 sys 0m0.171s 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:11.660 13:29:58 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:11.660 ************************************ 00:07:11.660 END TEST accel_decomp_mcore 00:07:11.660 ************************************ 00:07:11.660 13:29:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:11.660 13:29:58 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.660 13:29:58 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:11.660 13:29:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.660 13:29:58 accel -- common/autotest_common.sh@10 -- # set +x 00:07:11.660 ************************************ 00:07:11.660 START TEST accel_decomp_full_mcore 00:07:11.660 ************************************ 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.661 
13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:11.661 13:29:58 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:11.661 [2024-07-15 13:29:59.028689] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:11.661 [2024-07-15 13:29:59.028750] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4143794 ] 00:07:11.661 [2024-07-15 13:29:59.115932] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:11.661 [2024-07-15 13:29:59.204321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:11.661 [2024-07-15 13:29:59.204408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:11.661 [2024-07-15 13:29:59.204486] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:11.661 [2024-07-15 13:29:59.204488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.661 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:11.919 13:29:59 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.919 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.920 13:29:59 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.854 00:07:12.854 real 0m1.461s 00:07:12.854 user 0m4.753s 00:07:12.854 sys 0m0.166s 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.854 13:30:00 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:12.854 ************************************ 00:07:12.854 END TEST accel_decomp_full_mcore 00:07:12.854 ************************************ 00:07:13.113 13:30:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:13.113 13:30:00 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.113 13:30:00 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:13.113 13:30:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.113 13:30:00 accel -- common/autotest_common.sh@10 -- # set +x 00:07:13.113 ************************************ 00:07:13.113 START TEST accel_decomp_mthread 00:07:13.113 ************************************ 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:13.113 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:13.113 
13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:13.113 [2024-07-15 13:30:00.572330] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:13.113 [2024-07-15 13:30:00.572392] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4144122 ] 00:07:13.113 [2024-07-15 13:30:00.660336] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.371 [2024-07-15 13:30:00.751860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.371 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.372 13:30:00 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.372 13:30:00 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.744 00:07:14.744 real 0m1.440s 00:07:14.744 user 0m1.290s 00:07:14.744 sys 0m0.150s 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.744 13:30:01 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:14.744 ************************************ 00:07:14.744 END TEST accel_decomp_mthread 00:07:14.744 ************************************ 00:07:14.744 13:30:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:14.744 13:30:02 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.744 13:30:02 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:07:14.744 13:30:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.744 13:30:02 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.744 ************************************ 00:07:14.744 START TEST accel_decomp_full_mthread 00:07:14.744 ************************************ 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:14.744 [2024-07-15 13:30:02.086174] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:14.744 [2024-07-15 13:30:02.086232] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4144451 ] 00:07:14.744 [2024-07-15 13:30:02.172867] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.744 [2024-07-15 13:30:02.260166] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.744 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 
13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.745 13:30:02 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.745 13:30:02 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.136 00:07:16.136 real 0m1.458s 00:07:16.136 user 0m1.304s 00:07:16.136 sys 0m0.156s 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.136 13:30:03 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:16.136 ************************************ 00:07:16.136 END TEST accel_decomp_full_mthread 00:07:16.136 ************************************ 
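[editor's note] The decompress runs traced above are all driven by accel.sh's accel_test wrapper, which feeds an accel JSON config to the accel_perf example over a file descriptor and forwards the test flags. Below is a minimal stand-alone sketch of the accel_decomp_full_mthread case, using only the binary path and flags visible in the trace; the PERF/BIB shorthand variables are introduced here for readability only, and the outer "subsystems"/"accel"/"config" JSON wrapper is an assumption inferred from the jq filter used later in this log, not copied from accel.sh.

# Sketch: rerun the accel_decomp_full_mthread workload by hand (not the harness itself).
# Paths and flags are copied from the traced command lines above.
PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf
BIB=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib

# Software-module path: no accel JSON config is supplied, matching the empty
# accel_json_cfg checked by the "[[ -n '' ]]" test in the trace above.
$PERF -t 1 -w decompress -l $BIB -y -o 0 -T 2

# DPDK compressdev (QAT) path used by the accel_cdev_* tests further down: the same
# command, but with the compressdev_scan_accel_module entry piped in on fd 62.
# The "subsystems" wrapper shape is an assumption inferred from the
# jq -r '.subsystems[] | select(.subsystem=="accel").config[]' filter seen in the log.
$PERF -c /dev/fd/62 -t 1 -w decompress -l $BIB -y -o 0 -T 2 62<<'JSON'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        { "method": "compressdev_scan_accel_module", "params": { "pmd": 0 } }
      ]
    }
  ]
}
JSON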
00:07:16.136 13:30:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:16.136 13:30:03 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:16.136 13:30:03 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:16.136 13:30:03 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:16.136 13:30:03 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:16.136 13:30:03 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=4144639 00:07:16.136 13:30:03 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:16.136 13:30:03 accel -- accel/accel.sh@63 -- # waitforlisten 4144639 00:07:16.136 13:30:03 accel -- common/autotest_common.sh@829 -- # '[' -z 4144639 ']' 00:07:16.136 13:30:03 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:16.136 13:30:03 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.136 13:30:03 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:16.136 13:30:03 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:16.136 13:30:03 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.136 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:16.136 13:30:03 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:16.136 13:30:03 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:16.136 13:30:03 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.136 13:30:03 accel -- common/autotest_common.sh@10 -- # set +x 00:07:16.136 13:30:03 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.136 13:30:03 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:16.136 13:30:03 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:16.136 13:30:03 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:16.136 13:30:03 accel -- accel/accel.sh@41 -- # jq -r . 00:07:16.136 [2024-07-15 13:30:03.596534] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:16.136 [2024-07-15 13:30:03.596584] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4144639 ] 00:07:16.136 [2024-07-15 13:30:03.683934] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.394 [2024-07-15 13:30:03.773627] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.959 [2024-07-15 13:30:04.342699] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:16.959 13:30:04 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:16.959 13:30:04 accel -- common/autotest_common.sh@862 -- # return 0 00:07:16.959 13:30:04 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:16.959 13:30:04 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:16.959 13:30:04 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:16.959 13:30:04 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:16.959 13:30:04 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:16.959 13:30:04 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:16.959 13:30:04 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:16.959 13:30:04 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:16.959 13:30:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:16.959 13:30:04 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.217 "method": "compressdev_scan_accel_module", 00:07:17.217 13:30:04 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:17.217 13:30:04 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:17.217 13:30:04 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 
00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # IFS== 00:07:17.217 13:30:04 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:17.217 13:30:04 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:17.217 13:30:04 accel -- accel/accel.sh@75 -- # killprocess 4144639 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@948 -- # '[' -z 4144639 ']' 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@952 -- # kill -0 4144639 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@953 -- # uname 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4144639 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4144639' 00:07:17.217 killing process with pid 4144639 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@967 -- # kill 4144639 00:07:17.217 13:30:04 accel -- common/autotest_common.sh@972 -- # wait 4144639 00:07:17.784 13:30:05 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:17.784 13:30:05 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:17.784 13:30:05 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:17.784 13:30:05 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.784 13:30:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:17.784 ************************************ 00:07:17.784 START TEST accel_cdev_comp 00:07:17.784 ************************************ 00:07:17.784 13:30:05 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:17.784 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:17.784 [2024-07-15 13:30:05.210090] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:17.784 [2024-07-15 13:30:05.210149] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4145123 ] 00:07:17.784 [2024-07-15 13:30:05.299019] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.784 [2024-07-15 13:30:05.385273] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.351 [2024-07-15 13:30:05.946541] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:18.351 [2024-07-15 13:30:05.948424] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1207f00 PMD being used: compress_qat 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 [2024-07-15 13:30:05.951816] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x120ccb0 PMD being used: compress_qat 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.351 
13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:07:18.351 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:18.352 13:30:05 accel.accel_cdev_comp 
-- accel/accel.sh@21 -- # case "$var" in 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:18.352 13:30:05 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:19.728 13:30:07 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:19.728 00:07:19.728 real 0m1.941s 00:07:19.728 user 0m1.504s 00:07:19.728 sys 0m0.435s 00:07:19.728 13:30:07 accel.accel_cdev_comp 
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:19.728 13:30:07 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:19.728 ************************************ 00:07:19.728 END TEST accel_cdev_comp 00:07:19.728 ************************************ 00:07:19.728 13:30:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:19.728 13:30:07 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:19.728 13:30:07 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:19.728 13:30:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.728 13:30:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:19.728 ************************************ 00:07:19.728 START TEST accel_cdev_decomp 00:07:19.728 ************************************ 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:19.728 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:19.728 [2024-07-15 13:30:07.229389] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:19.728 [2024-07-15 13:30:07.229443] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4145529 ] 00:07:19.729 [2024-07-15 13:30:07.316724] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.014 [2024-07-15 13:30:07.408942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.593 [2024-07-15 13:30:07.944688] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:20.593 [2024-07-15 13:30:07.946566] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa8ff00 PMD being used: compress_qat 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 [2024-07-15 13:30:07.950226] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa94cb0 PMD being used: compress_qat 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.593 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:07:20.594 13:30:07 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:21.528 00:07:21.528 real 0m1.918s 00:07:21.528 user 0m1.498s 00:07:21.528 sys 0m0.426s 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.528 13:30:09 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:21.528 ************************************ 00:07:21.528 END TEST accel_cdev_decomp 00:07:21.528 ************************************ 00:07:21.787 13:30:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:21.787 13:30:09 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.787 13:30:09 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:21.787 13:30:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.787 13:30:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:21.787 ************************************ 00:07:21.787 START TEST accel_cdev_decomp_full 00:07:21.787 ************************************ 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:21.787 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:21.787 [2024-07-15 13:30:09.230752] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
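The _full variant differs from the previous case only by the extra "-o 0" flag, and the trace below correspondingly reports a 111250-byte payload where accel_cdev_decomp reported 4096 bytes; reading "-o 0" as "size operations to the whole input rather than 4 KiB blocks" is inferred from those two values, not from the tool's help text. A hedged re-run sketch, reusing $cfg from the sketch above:

  # Same command as logged for accel_cdev_decomp_full; only -o 0 is new.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
      -c <(printf '%s' "$cfg") -t 1 -w decompress \
      -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0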
00:07:21.787 [2024-07-15 13:30:09.230815] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4145790 ] 00:07:21.787 [2024-07-15 13:30:09.314903] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.787 [2024-07-15 13:30:09.396951] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.354 [2024-07-15 13:30:09.921935] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:22.354 [2024-07-15 13:30:09.923851] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23c7f00 PMD being used: compress_qat 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.354 [2024-07-15 13:30:09.926645] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23c7fa0 PMD being used: compress_qat 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.354 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:22.355 13:30:09 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:23.729 00:07:23.729 real 0m1.896s 00:07:23.729 user 0m1.480s 00:07:23.729 sys 0m0.420s 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.729 13:30:11 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:23.729 ************************************ 00:07:23.729 END TEST accel_cdev_decomp_full 00:07:23.729 ************************************ 00:07:23.729 13:30:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:23.729 13:30:11 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.729 13:30:11 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:23.729 13:30:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.729 13:30:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:23.729 ************************************ 00:07:23.729 START TEST accel_cdev_decomp_mcore 00:07:23.729 ************************************ 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:23.729 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:23.729 [2024-07-15 13:30:11.213138] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
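The mcore case adds "-m 0xf", which shows up as "-c 0xf" in the DPDK EAL parameters below and brings up four reactors instead of one ("Total cores available: 4"). The mask arithmetic, for reference:

  # 0xf = binary 1111 -> cores 0-3, matching the four
  # "Reactor started on core N" notices in the trace below.
  printf 'mask=0x%x cores=' "$((2#1111))"
  for i in 0 1 2 3; do (( (0xf >> i) & 1 )) && printf '%d ' "$i"; done; echo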
00:07:23.729 [2024-07-15 13:30:11.213194] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4146047 ] 00:07:23.729 [2024-07-15 13:30:11.302043] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:23.987 [2024-07-15 13:30:11.391350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.987 [2024-07-15 13:30:11.391439] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:23.987 [2024-07-15 13:30:11.391515] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:23.987 [2024-07-15 13:30:11.391517] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.552 [2024-07-15 13:30:11.947765] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:24.552 [2024-07-15 13:30:11.949728] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e225a0 PMD being used: compress_qat 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 [2024-07-15 13:30:11.954455] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f355019b8b0 PMD being used: compress_qat 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 [2024-07-15 13:30:11.955325] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f354819b8b0 PMD being used: compress_qat 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:24.552 [2024-07-15 13:30:11.955891] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e27880 PMD being used: compress_qat 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 [2024-07-15 13:30:11.956114] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f354019b8b0 PMD being used: compress_qat 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.552 13:30:11 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.552 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:24.553 13:30:11 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:25.922 00:07:25.922 real 0m1.945s 00:07:25.922 user 0m6.424s 00:07:25.922 sys 0m0.443s 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:25.922 13:30:13 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:25.922 ************************************ 00:07:25.922 END TEST accel_cdev_decomp_mcore 00:07:25.922 ************************************ 00:07:25.922 13:30:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:25.922 13:30:13 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.922 13:30:13 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:25.922 13:30:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.922 13:30:13 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.922 ************************************ 00:07:25.922 START TEST accel_cdev_decomp_full_mcore 00:07:25.922 ************************************ 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:25.922 
13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:25.922 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:25.922 [2024-07-15 13:30:13.242505] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:25.922 [2024-07-15 13:30:13.242562] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4146359 ] 00:07:25.922 [2024-07-15 13:30:13.328604] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:25.922 [2024-07-15 13:30:13.416495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.922 [2024-07-15 13:30:13.416581] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:25.922 [2024-07-15 13:30:13.416660] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:25.922 [2024-07-15 13:30:13.416662] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.488 [2024-07-15 13:30:13.974932] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:26.488 [2024-07-15 13:30:13.976940] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13865a0 PMD being used: compress_qat 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 [2024-07-15 13:30:13.980937] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f24c819b8b0 PMD being used: compress_qat 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 [2024-07-15 13:30:13.981874] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f24c019b8b0 PMD being used: compress_qat 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 
00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 [2024-07-15 13:30:13.982503] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1386640 PMD being used: compress_qat 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 [2024-07-15 13:30:13.982720] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f24b819b8b0 PMD being used: compress_qat 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.488 13:30:13 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.860 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:27.861 00:07:27.861 real 0m1.960s 00:07:27.861 user 0m6.461s 00:07:27.861 sys 0m0.459s 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:27.861 13:30:15 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:27.861 ************************************ 00:07:27.861 END TEST accel_cdev_decomp_full_mcore 00:07:27.861 ************************************ 00:07:27.861 13:30:15 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:27.861 13:30:15 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:27.861 13:30:15 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:27.861 13:30:15 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.861 13:30:15 accel -- common/autotest_common.sh@10 -- # set +x 00:07:27.861 ************************************ 00:07:27.861 START TEST accel_cdev_decomp_mthread 00:07:27.861 ************************************ 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:27.861 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:27.861 [2024-07-15 13:30:15.279909] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
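The mthread case appends "-T 2" to the same accel_perf command line, and the trace below records val=2 at the point where the earlier single-threaded runs recorded val=1; treating that value as the worker-thread count for the decompress workload is an inference from the test name rather than from documented accel_perf behaviour. A sketch of the logged invocation, again reusing $cfg from the first sketch:

  # As logged for accel_cdev_decomp_mthread; only -T 2 is new.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
      -c <(printf '%s' "$cfg") -t 1 -w decompress \
      -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2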
00:07:27.861 [2024-07-15 13:30:15.279968] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4146607 ] 00:07:27.861 [2024-07-15 13:30:15.367367] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.861 [2024-07-15 13:30:15.451606] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.427 [2024-07-15 13:30:15.997037] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:28.427 [2024-07-15 13:30:15.998951] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1260f00 PMD being used: compress_qat 00:07:28.427 13:30:15 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 [2024-07-15 13:30:16.003225] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1266110 PMD being used: compress_qat 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # 
read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 [2024-07-15 13:30:16.005139] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1388f70 PMD being used: compress_qat 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 
13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:28.427 13:30:16 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:29.799 00:07:29.799 real 0m1.922s 00:07:29.799 user 0m1.484s 00:07:29.799 sys 0m0.428s 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:29.799 13:30:17 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:29.799 ************************************ 00:07:29.799 END TEST accel_cdev_decomp_mthread 00:07:29.799 ************************************ 00:07:29.799 13:30:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:29.799 13:30:17 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.799 13:30:17 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:29.799 13:30:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.799 13:30:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:29.799 ************************************ 00:07:29.799 START TEST accel_cdev_decomp_full_mthread 00:07:29.799 ************************************ 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:29.799 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:29.799 [2024-07-15 13:30:17.277193] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:29.799 [2024-07-15 13:30:17.277256] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4146928 ] 00:07:29.799 [2024-07-15 13:30:17.362901] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.057 [2024-07-15 13:30:17.448639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.623 [2024-07-15 13:30:17.987946] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:30.623 [2024-07-15 13:30:17.989898] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa8af00 PMD being used: compress_qat 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 [2024-07-15 13:30:17.993443] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa8afa0 PMD being used: compress_qat 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:30.623 13:30:17 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 [2024-07-15 13:30:17.995630] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbb2b80 PMD being used: compress_qat 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.623 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.624 13:30:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:31.555 00:07:31.555 real 0m1.917s 00:07:31.555 user 0m1.484s 00:07:31.555 sys 0m0.427s 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:31.555 13:30:19 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:31.555 ************************************ 00:07:31.555 END TEST accel_cdev_decomp_full_mthread 00:07:31.555 ************************************ 00:07:31.812 13:30:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:31.812 13:30:19 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:07:31.812 13:30:19 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:31.812 13:30:19 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:31.812 13:30:19 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:31.812 13:30:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.812 13:30:19 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.812 13:30:19 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.812 13:30:19 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.812 13:30:19 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.812 13:30:19 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.812 13:30:19 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:31.812 13:30:19 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:31.812 13:30:19 accel -- accel/accel.sh@41 -- # jq -r . 00:07:31.812 ************************************ 00:07:31.812 START TEST accel_dif_functional_tests 00:07:31.812 ************************************ 00:07:31.812 13:30:19 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:31.812 [2024-07-15 13:30:19.297654] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:31.813 [2024-07-15 13:30:19.297702] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4147154 ] 00:07:31.813 [2024-07-15 13:30:19.383782] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:32.070 [2024-07-15 13:30:19.470463] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:32.070 [2024-07-15 13:30:19.470551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:32.070 [2024-07-15 13:30:19.470553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.070 00:07:32.070 00:07:32.070 CUnit - A unit testing framework for C - Version 2.1-3 00:07:32.070 http://cunit.sourceforge.net/ 00:07:32.070 00:07:32.070 00:07:32.070 Suite: accel_dif 00:07:32.070 Test: verify: DIF generated, GUARD check ...passed 00:07:32.070 Test: verify: DIF generated, APPTAG check ...passed 00:07:32.070 Test: verify: DIF generated, REFTAG check ...passed 00:07:32.070 Test: verify: DIF not generated, GUARD check ...[2024-07-15 13:30:19.574605] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:32.070 passed 00:07:32.070 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 13:30:19.574664] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:32.070 passed 00:07:32.070 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 13:30:19.574702] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:32.070 passed 00:07:32.070 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:32.070 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 13:30:19.574752] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:32.070 passed 00:07:32.070 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:32.070 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:32.070 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:32.070 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 13:30:19.574857] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:32.070 passed 00:07:32.070 Test: verify copy: DIF generated, GUARD check ...passed 00:07:32.070 Test: verify copy: DIF generated, APPTAG check ...passed 00:07:32.070 Test: verify copy: DIF generated, REFTAG check ...passed 00:07:32.070 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 13:30:19.574978] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:32.070 passed 00:07:32.070 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 13:30:19.575009] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:32.070 passed 00:07:32.070 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 13:30:19.575032] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:32.070 passed 00:07:32.070 Test: generate copy: DIF generated, GUARD check ...passed 00:07:32.070 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:32.070 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:32.070 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:07:32.070 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:32.070 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:32.070 Test: generate copy: iovecs-len validate ...[2024-07-15 13:30:19.575203] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:07:32.070 passed 00:07:32.070 Test: generate copy: buffer alignment validate ...passed 00:07:32.070 00:07:32.070 Run Summary: Type Total Ran Passed Failed Inactive 00:07:32.070 suites 1 1 n/a 0 0 00:07:32.070 tests 26 26 26 0 0 00:07:32.070 asserts 115 115 115 0 n/a 00:07:32.070 00:07:32.070 Elapsed time = 0.002 seconds 00:07:32.328 00:07:32.328 real 0m0.535s 00:07:32.328 user 0m0.775s 00:07:32.328 sys 0m0.194s 00:07:32.328 13:30:19 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.328 13:30:19 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:07:32.328 ************************************ 00:07:32.328 END TEST accel_dif_functional_tests 00:07:32.328 ************************************ 00:07:32.328 13:30:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:32.328 00:07:32.328 real 0m49.063s 00:07:32.328 user 0m57.740s 00:07:32.328 sys 0m9.746s 00:07:32.328 13:30:19 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.328 13:30:19 accel -- common/autotest_common.sh@10 -- # set +x 00:07:32.328 ************************************ 00:07:32.328 END TEST accel 00:07:32.328 ************************************ 00:07:32.328 13:30:19 -- common/autotest_common.sh@1142 -- # return 0 00:07:32.328 13:30:19 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:32.328 13:30:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:32.328 13:30:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.328 13:30:19 -- common/autotest_common.sh@10 -- # set +x 00:07:32.328 ************************************ 00:07:32.328 START TEST accel_rpc 00:07:32.328 ************************************ 00:07:32.328 13:30:19 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:32.586 * Looking for test storage... 00:07:32.586 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:32.586 13:30:20 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:32.586 13:30:20 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=4147355 00:07:32.586 13:30:20 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 4147355 00:07:32.586 13:30:20 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:32.586 13:30:20 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 4147355 ']' 00:07:32.586 13:30:20 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.586 13:30:20 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:32.586 13:30:20 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:32.586 13:30:20 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:32.586 13:30:20 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.586 [2024-07-15 13:30:20.079610] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:32.586 [2024-07-15 13:30:20.079668] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4147355 ] 00:07:32.586 [2024-07-15 13:30:20.166407] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.843 [2024-07-15 13:30:20.255404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.408 13:30:20 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:33.408 13:30:20 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:33.408 13:30:20 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:33.408 13:30:20 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:33.408 13:30:20 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:33.408 13:30:20 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:33.408 13:30:20 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:33.408 13:30:20 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:33.408 13:30:20 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.408 13:30:20 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:33.408 ************************************ 00:07:33.408 START TEST accel_assign_opcode 00:07:33.408 ************************************ 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:33.408 [2024-07-15 13:30:20.913399] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:33.408 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:33.408 [2024-07-15 13:30:20.921404] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:33.409 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:33.409 13:30:20 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:33.409 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:33.409 13:30:20 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:33.667 software 00:07:33.667 00:07:33.667 real 0m0.277s 00:07:33.667 user 0m0.052s 00:07:33.667 sys 0m0.010s 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.667 13:30:21 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:33.667 ************************************ 00:07:33.667 END TEST accel_assign_opcode 00:07:33.667 ************************************ 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:33.667 13:30:21 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 4147355 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 4147355 ']' 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 4147355 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4147355 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4147355' 00:07:33.667 killing process with pid 4147355 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@967 -- # kill 4147355 00:07:33.667 13:30:21 accel_rpc -- common/autotest_common.sh@972 -- # wait 4147355 00:07:34.232 00:07:34.232 real 0m1.722s 00:07:34.232 user 0m1.732s 00:07:34.232 sys 0m0.507s 00:07:34.232 13:30:21 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:34.232 13:30:21 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.232 ************************************ 00:07:34.232 END TEST accel_rpc 00:07:34.232 ************************************ 00:07:34.232 13:30:21 -- common/autotest_common.sh@1142 -- # return 0 00:07:34.232 13:30:21 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:34.232 13:30:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:34.232 13:30:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.232 13:30:21 -- common/autotest_common.sh@10 -- # set +x 00:07:34.232 ************************************ 00:07:34.232 START TEST app_cmdline 00:07:34.232 ************************************ 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:34.232 * Looking for test storage... 
00:07:34.232 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:34.232 13:30:21 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:34.232 13:30:21 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=4147614 00:07:34.232 13:30:21 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 4147614 00:07:34.232 13:30:21 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 4147614 ']' 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.232 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:34.232 13:30:21 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:34.232 [2024-07-15 13:30:21.848240] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:34.232 [2024-07-15 13:30:21.848295] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4147614 ] 00:07:34.490 [2024-07-15 13:30:21.934490] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.490 [2024-07-15 13:30:22.016035] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.055 13:30:22 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:35.055 13:30:22 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:07:35.055 13:30:22 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:35.314 { 00:07:35.314 "version": "SPDK v24.09-pre git sha1 9cede6267", 00:07:35.314 "fields": { 00:07:35.314 "major": 24, 00:07:35.314 "minor": 9, 00:07:35.314 "patch": 0, 00:07:35.314 "suffix": "-pre", 00:07:35.314 "commit": "9cede6267" 00:07:35.314 } 00:07:35.314 } 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:35.314 13:30:22 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:07:35.314 13:30:22 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.572 request: 00:07:35.572 { 00:07:35.572 "method": "env_dpdk_get_mem_stats", 00:07:35.572 "req_id": 1 00:07:35.572 } 00:07:35.572 Got JSON-RPC error response 00:07:35.572 response: 00:07:35.572 { 00:07:35.572 "code": -32601, 00:07:35.572 "message": "Method not found" 00:07:35.572 } 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:35.572 13:30:23 app_cmdline -- app/cmdline.sh@1 -- # killprocess 4147614 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 4147614 ']' 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 4147614 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4147614 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4147614' 00:07:35.572 killing process with pid 4147614 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@967 -- # kill 4147614 00:07:35.572 13:30:23 app_cmdline -- common/autotest_common.sh@972 -- # wait 4147614 00:07:35.832 00:07:35.833 real 0m1.716s 00:07:35.833 user 0m1.960s 00:07:35.833 sys 0m0.503s 00:07:35.833 13:30:23 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.833 13:30:23 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
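Not part of the captured output — a minimal sketch of the check app_cmdline exercises above. The target is started with an RPC whitelist (--rpcs-allowed spdk_get_version,rpc_get_methods), so only those two methods are served and anything else is rejected with JSON-RPC error -32601 "Method not found". One way to reproduce the same behaviour by hand, assuming paths relative to the SPDK checkout used in this run, would be:

  ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
  ./scripts/rpc.py spdk_get_version          # whitelisted: returns the version JSON shown above
  ./scripts/rpc.py rpc_get_methods           # whitelisted: lists only the two allowed methods
  ./scripts/rpc.py env_dpdk_get_mem_stats    # not whitelisted: expected to fail with "Method not found" (-32601), as above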
00:07:35.833 ************************************ 00:07:35.833 END TEST app_cmdline 00:07:35.833 ************************************ 00:07:36.091 13:30:23 -- common/autotest_common.sh@1142 -- # return 0 00:07:36.091 13:30:23 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:36.091 13:30:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:36.091 13:30:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.091 13:30:23 -- common/autotest_common.sh@10 -- # set +x 00:07:36.091 ************************************ 00:07:36.091 START TEST version 00:07:36.091 ************************************ 00:07:36.091 13:30:23 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:36.091 * Looking for test storage... 00:07:36.091 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:36.091 13:30:23 version -- app/version.sh@17 -- # get_header_version major 00:07:36.091 13:30:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # cut -f2 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.091 13:30:23 version -- app/version.sh@17 -- # major=24 00:07:36.091 13:30:23 version -- app/version.sh@18 -- # get_header_version minor 00:07:36.091 13:30:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # cut -f2 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.091 13:30:23 version -- app/version.sh@18 -- # minor=9 00:07:36.091 13:30:23 version -- app/version.sh@19 -- # get_header_version patch 00:07:36.091 13:30:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # cut -f2 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.091 13:30:23 version -- app/version.sh@19 -- # patch=0 00:07:36.091 13:30:23 version -- app/version.sh@20 -- # get_header_version suffix 00:07:36.091 13:30:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # cut -f2 00:07:36.091 13:30:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.091 13:30:23 version -- app/version.sh@20 -- # suffix=-pre 00:07:36.091 13:30:23 version -- app/version.sh@22 -- # version=24.9 00:07:36.091 13:30:23 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:36.091 13:30:23 version -- app/version.sh@28 -- # version=24.9rc0 00:07:36.091 13:30:23 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:36.091 13:30:23 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:36.091 13:30:23 version -- app/version.sh@30 -- # py_version=24.9rc0 00:07:36.091 
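Not part of the captured output — a sketch of what version.sh is doing above: it rebuilds the version string from include/spdk/version.h and compares it with python3 -c 'import spdk; print(spdk.__version__)'. Each component is pulled out with the same grep/cut/tr pipeline seen in the log, e.g. (run from the repository root, assuming the header layout used in this run):

  grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' include/spdk/version.h | cut -f2 | tr -d '"'   # 24
  grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' include/spdk/version.h | cut -f2 | tr -d '"'   # 9
  grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' include/spdk/version.h | cut -f2 | tr -d '"'  # -pre

with the patch level handled the same way; since patch is 0 the version stays "24.9", and the "-pre" suffix becomes the "rc0" seen in py_version=24.9rc0.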
13:30:23 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:07:36.091 00:07:36.091 real 0m0.169s 00:07:36.091 user 0m0.080s 00:07:36.091 sys 0m0.137s 00:07:36.091 13:30:23 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.091 13:30:23 version -- common/autotest_common.sh@10 -- # set +x 00:07:36.091 ************************************ 00:07:36.091 END TEST version 00:07:36.091 ************************************ 00:07:36.349 13:30:23 -- common/autotest_common.sh@1142 -- # return 0 00:07:36.349 13:30:23 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:07:36.349 13:30:23 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:36.349 13:30:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:36.349 13:30:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.349 13:30:23 -- common/autotest_common.sh@10 -- # set +x 00:07:36.349 ************************************ 00:07:36.349 START TEST blockdev_general 00:07:36.349 ************************************ 00:07:36.349 13:30:23 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:36.349 * Looking for test storage... 00:07:36.349 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:36.349 13:30:23 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:36.349 13:30:23 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=4148077 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:36.350 13:30:23 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 4148077 00:07:36.350 13:30:23 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 4148077 ']' 00:07:36.350 13:30:23 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:36.350 13:30:23 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:36.350 13:30:23 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:36.350 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:36.350 13:30:23 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:36.350 13:30:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:36.350 [2024-07-15 13:30:23.940811] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:07:36.350 [2024-07-15 13:30:23.940872] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4148077 ] 00:07:36.608 [2024-07-15 13:30:24.025168] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.608 [2024-07-15 13:30:24.112164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.173 13:30:24 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:37.173 13:30:24 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:07:37.173 13:30:24 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:07:37.173 13:30:24 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:07:37.173 13:30:24 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:37.173 13:30:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.173 13:30:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.430 [2024-07-15 13:30:24.983183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:37.430 [2024-07-15 13:30:24.983231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:37.430 00:07:37.430 [2024-07-15 13:30:24.991170] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:37.430 [2024-07-15 13:30:24.991188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:37.430 00:07:37.430 Malloc0 00:07:37.430 Malloc1 00:07:37.430 Malloc2 00:07:37.688 Malloc3 00:07:37.688 Malloc4 00:07:37.688 Malloc5 00:07:37.688 Malloc6 00:07:37.688 Malloc7 00:07:37.688 Malloc8 00:07:37.688 Malloc9 00:07:37.688 [2024-07-15 13:30:25.130179] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:37.688 [2024-07-15 13:30:25.130221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:37.688 [2024-07-15 
13:30:25.130237] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a0820 00:07:37.688 [2024-07-15 13:30:25.130245] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:37.688 [2024-07-15 13:30:25.131192] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:37.688 [2024-07-15 13:30:25.131214] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:37.688 TestPT 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.688 13:30:25 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:37.688 5000+0 records in 00:07:37.688 5000+0 records out 00:07:37.688 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0264299 s, 387 MB/s 00:07:37.688 13:30:25 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.688 AIO0 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.688 13:30:25 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.688 13:30:25 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:07:37.688 13:30:25 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.688 13:30:25 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.688 13:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.945 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.946 13:30:25 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:37.946 13:30:25 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.946 13:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.946 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.946 13:30:25 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:07:37.946 13:30:25 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:07:37.946 13:30:25 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:07:37.946 13:30:25 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.946 13:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.946 13:30:25 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.946 13:30:25 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:07:37.946 13:30:25 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:07:37.947 13:30:25 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "48aa72e9-5f15-4c80-a523-7f7dfaf32119"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "48aa72e9-5f15-4c80-a523-7f7dfaf32119",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e21b7357-ffe2-5fdc-b094-4d118ac5ad60"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e21b7357-ffe2-5fdc-b094-4d118ac5ad60",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "35f3a3da-62c6-566e-b025-6d3e8a7d1154"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "35f3a3da-62c6-566e-b025-6d3e8a7d1154",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "363685c1-16d0-51be-a5e2-2b7e8208775a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "363685c1-16d0-51be-a5e2-2b7e8208775a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "b3a2c8bd-5209-545c-95a7-8acf29cd4c78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b3a2c8bd-5209-545c-95a7-8acf29cd4c78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b0e2e76d-7c52-5ddb-8294-12ef7ae1b6f5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b0e2e76d-7c52-5ddb-8294-12ef7ae1b6f5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "eb774bb3-c666-5cf2-b8e1-f22fa2700420"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "eb774bb3-c666-5cf2-b8e1-f22fa2700420",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"67b72e79-c0f8-5115-a477-c17b8f81a4e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67b72e79-c0f8-5115-a477-c17b8f81a4e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "c91e2198-f4d6-5eac-a830-916aca99aff2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c91e2198-f4d6-5eac-a830-916aca99aff2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c80adbba-b879-55e2-ac1e-429ce972f83e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c80adbba-b879-55e2-ac1e-429ce972f83e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e265e266-c67d-581e-a123-a6d59d2e4feb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e265e266-c67d-581e-a123-a6d59d2e4feb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "5209a344-f469-546a-aa61-d17b62e961e8"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5209a344-f469-546a-aa61-d17b62e961e8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "4c5cebaa-e3ef-4f89-8387-dfd86af2a668"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4c5cebaa-e3ef-4f89-8387-dfd86af2a668",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4c5cebaa-e3ef-4f89-8387-dfd86af2a668",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5547bc2d-e462-4cc2-855e-12666afbfc68",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "579f7b31-181b-4f86-b409-9aff863d20e0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "6531a738-0867-45d8-8c5a-29898c4ce58d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "6531a738-0867-45d8-8c5a-29898c4ce58d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "6531a738-0867-45d8-8c5a-29898c4ce58d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "5fb74f07-b8a8-4982-80a5-1caa213f221c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "9b41b728-fd4e-44ae-9f3f-cb45a43c6e5f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "8c8c3bf6-3c02-450b-96b5-57fa82f54c4f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "a0d722c7-a440-424c-8ac0-5e285abb118d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "f5c49728-c43b-4df9-835d-d6301e7bccc4"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"f5c49728-c43b-4df9-835d-d6301e7bccc4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:37.947 13:30:25 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:07:37.947 13:30:25 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:07:37.947 13:30:25 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:07:37.947 13:30:25 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 4148077 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 4148077 ']' 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 4148077 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4148077 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4148077' 00:07:37.947 killing process with pid 4148077 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@967 -- # kill 4148077 00:07:37.947 13:30:25 blockdev_general -- common/autotest_common.sh@972 -- # wait 4148077 00:07:38.510 13:30:26 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:38.510 13:30:26 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:38.510 13:30:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:38.510 13:30:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.510 13:30:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.510 ************************************ 00:07:38.510 START TEST bdev_hello_world 00:07:38.510 ************************************ 00:07:38.510 13:30:26 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:38.510 [2024-07-15 13:30:26.102714] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:38.510 [2024-07-15 13:30:26.102758] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4148340 ] 00:07:38.766 [2024-07-15 13:30:26.190211] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.766 [2024-07-15 13:30:26.275855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.023 [2024-07-15 13:30:26.423639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:39.023 [2024-07-15 13:30:26.423688] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:39.023 [2024-07-15 13:30:26.423697] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:39.023 [2024-07-15 13:30:26.431637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:39.023 [2024-07-15 13:30:26.431655] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:39.023 [2024-07-15 13:30:26.439649] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:39.023 [2024-07-15 13:30:26.439664] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:39.023 [2024-07-15 13:30:26.512571] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:39.023 [2024-07-15 13:30:26.512613] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:39.023 [2024-07-15 13:30:26.512627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df1270 00:07:39.023 [2024-07-15 13:30:26.512635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:39.023 [2024-07-15 13:30:26.513618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:39.023 [2024-07-15 13:30:26.513640] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:39.278 [2024-07-15 13:30:26.646301] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:39.278 [2024-07-15 13:30:26.646338] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:39.278 [2024-07-15 13:30:26.646364] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:39.278 [2024-07-15 13:30:26.646401] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:39.278 [2024-07-15 13:30:26.646437] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:39.278 [2024-07-15 13:30:26.646451] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:39.278 [2024-07-15 13:30:26.646480] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:07:39.278 00:07:39.278 [2024-07-15 13:30:26.646497] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:39.533 00:07:39.533 real 0m0.872s 00:07:39.533 user 0m0.579s 00:07:39.533 sys 0m0.262s 00:07:39.533 13:30:26 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:39.533 13:30:26 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:39.533 ************************************ 00:07:39.533 END TEST bdev_hello_world 00:07:39.533 ************************************ 00:07:39.533 13:30:26 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:07:39.533 13:30:26 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:07:39.533 13:30:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:39.533 13:30:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.533 13:30:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:39.533 ************************************ 00:07:39.533 START TEST bdev_bounds 00:07:39.533 ************************************ 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=4148481 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 4148481' 00:07:39.533 Process bdevio pid: 4148481 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 4148481 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 4148481 ']' 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.533 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:39.533 13:30:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:39.533 [2024-07-15 13:30:27.068490] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:07:39.533 [2024-07-15 13:30:27.068536] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4148481 ] 00:07:39.789 [2024-07-15 13:30:27.157436] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:39.789 [2024-07-15 13:30:27.249824] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:39.789 [2024-07-15 13:30:27.249914] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:39.789 [2024-07-15 13:30:27.249917] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.789 [2024-07-15 13:30:27.397777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:39.789 [2024-07-15 13:30:27.397825] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:39.789 [2024-07-15 13:30:27.397854] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:39.789 [2024-07-15 13:30:27.405799] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:39.789 [2024-07-15 13:30:27.405824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:40.045 [2024-07-15 13:30:27.413809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:40.045 [2024-07-15 13:30:27.413831] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:40.045 [2024-07-15 13:30:27.486244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:40.045 [2024-07-15 13:30:27.486287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:40.045 [2024-07-15 13:30:27.486299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275e4c0 00:07:40.045 [2024-07-15 13:30:27.486324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:40.045 [2024-07-15 13:30:27.487556] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:40.045 [2024-07-15 13:30:27.487580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:40.301 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:40.301 13:30:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:07:40.301 13:30:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:40.559 I/O targets: 00:07:40.559 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:40.559 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:40.559 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:40.559 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:40.559 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:40.559 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:40.559 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:07:40.559 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:40.559 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:40.559 00:07:40.559 00:07:40.559 CUnit - A unit testing framework for C - Version 2.1-3 00:07:40.559 http://cunit.sourceforge.net/ 00:07:40.559 00:07:40.559 00:07:40.559 Suite: bdevio tests on: AIO0 00:07:40.559 Test: blockdev write read block ...passed 00:07:40.559 Test: blockdev write zeroes read block ...passed 00:07:40.559 Test: blockdev write zeroes read no split ...passed 00:07:40.559 Test: blockdev write zeroes read split ...passed 00:07:40.559 Test: blockdev write zeroes read split partial ...passed 00:07:40.559 Test: blockdev reset ...passed 00:07:40.559 Test: blockdev write read 8 blocks ...passed 00:07:40.559 Test: blockdev write read size > 128k ...passed 00:07:40.559 Test: blockdev write read invalid size ...passed 00:07:40.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.559 Test: blockdev write read max offset ...passed 00:07:40.559 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.559 Test: blockdev writev readv 8 blocks ...passed 00:07:40.559 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.559 Test: blockdev writev readv block ...passed 00:07:40.559 Test: blockdev writev readv size > 128k ...passed 00:07:40.559 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.559 Test: blockdev comparev and writev ...passed 00:07:40.559 Test: blockdev nvme passthru rw ...passed 00:07:40.559 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.559 Test: blockdev nvme admin passthru ...passed 00:07:40.559 Test: blockdev copy ...passed 00:07:40.559 Suite: bdevio tests on: raid1 00:07:40.559 Test: blockdev write read block ...passed 00:07:40.559 Test: blockdev write zeroes read block ...passed 00:07:40.559 Test: blockdev write zeroes read no split ...passed 00:07:40.559 Test: blockdev write zeroes read split ...passed 00:07:40.559 Test: blockdev write zeroes read split partial ...passed 00:07:40.559 Test: blockdev reset ...passed 00:07:40.559 Test: blockdev write read 8 blocks ...passed 00:07:40.559 Test: blockdev write read size > 128k ...passed 00:07:40.559 Test: blockdev write read invalid size ...passed 00:07:40.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.559 Test: blockdev write read max offset ...passed 00:07:40.559 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.559 Test: blockdev writev readv 8 blocks ...passed 00:07:40.559 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.559 Test: blockdev writev readv block ...passed 00:07:40.559 Test: blockdev writev readv size > 128k ...passed 00:07:40.559 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.559 Test: blockdev comparev and writev ...passed 00:07:40.559 Test: blockdev nvme passthru rw ...passed 00:07:40.559 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.559 Test: blockdev nvme admin passthru ...passed 00:07:40.559 Test: blockdev copy ...passed 00:07:40.559 Suite: bdevio tests on: concat0 00:07:40.559 Test: blockdev write read block ...passed 00:07:40.559 Test: blockdev write zeroes read block ...passed 00:07:40.559 Test: blockdev write zeroes read no split ...passed 00:07:40.559 Test: blockdev write zeroes read split 
...passed 00:07:40.559 Test: blockdev write zeroes read split partial ...passed 00:07:40.559 Test: blockdev reset ...passed 00:07:40.559 Test: blockdev write read 8 blocks ...passed 00:07:40.559 Test: blockdev write read size > 128k ...passed 00:07:40.559 Test: blockdev write read invalid size ...passed 00:07:40.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.559 Test: blockdev write read max offset ...passed 00:07:40.559 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.559 Test: blockdev writev readv 8 blocks ...passed 00:07:40.559 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.559 Test: blockdev writev readv block ...passed 00:07:40.559 Test: blockdev writev readv size > 128k ...passed 00:07:40.559 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.559 Test: blockdev comparev and writev ...passed 00:07:40.559 Test: blockdev nvme passthru rw ...passed 00:07:40.559 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.559 Test: blockdev nvme admin passthru ...passed 00:07:40.559 Test: blockdev copy ...passed 00:07:40.559 Suite: bdevio tests on: raid0 00:07:40.559 Test: blockdev write read block ...passed 00:07:40.559 Test: blockdev write zeroes read block ...passed 00:07:40.559 Test: blockdev write zeroes read no split ...passed 00:07:40.559 Test: blockdev write zeroes read split ...passed 00:07:40.559 Test: blockdev write zeroes read split partial ...passed 00:07:40.559 Test: blockdev reset ...passed 00:07:40.559 Test: blockdev write read 8 blocks ...passed 00:07:40.559 Test: blockdev write read size > 128k ...passed 00:07:40.559 Test: blockdev write read invalid size ...passed 00:07:40.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.559 Test: blockdev write read max offset ...passed 00:07:40.559 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.559 Test: blockdev writev readv 8 blocks ...passed 00:07:40.559 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.559 Test: blockdev writev readv block ...passed 00:07:40.559 Test: blockdev writev readv size > 128k ...passed 00:07:40.559 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.559 Test: blockdev comparev and writev ...passed 00:07:40.559 Test: blockdev nvme passthru rw ...passed 00:07:40.559 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.559 Test: blockdev nvme admin passthru ...passed 00:07:40.559 Test: blockdev copy ...passed 00:07:40.559 Suite: bdevio tests on: TestPT 00:07:40.559 Test: blockdev write read block ...passed 00:07:40.559 Test: blockdev write zeroes read block ...passed 00:07:40.559 Test: blockdev write zeroes read no split ...passed 00:07:40.559 Test: blockdev write zeroes read split ...passed 00:07:40.559 Test: blockdev write zeroes read split partial ...passed 00:07:40.559 Test: blockdev reset ...passed 00:07:40.559 Test: blockdev write read 8 blocks ...passed 00:07:40.559 Test: blockdev write read size > 128k ...passed 00:07:40.559 Test: blockdev write read invalid size ...passed 00:07:40.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.559 Test: blockdev write read max offset ...passed 00:07:40.559 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.559 Test: blockdev writev readv 8 blocks ...passed 00:07:40.559 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.559 Test: blockdev writev readv block ...passed 00:07:40.559 Test: blockdev writev readv size > 128k ...passed 00:07:40.559 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.559 Test: blockdev comparev and writev ...passed 00:07:40.559 Test: blockdev nvme passthru rw ...passed 00:07:40.559 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.559 Test: blockdev nvme admin passthru ...passed 00:07:40.559 Test: blockdev copy ...passed 00:07:40.559 Suite: bdevio tests on: Malloc2p7 00:07:40.559 Test: blockdev write read block ...passed 00:07:40.559 Test: blockdev write zeroes read block ...passed 00:07:40.559 Test: blockdev write zeroes read no split ...passed 00:07:40.559 Test: blockdev write zeroes read split ...passed 00:07:40.559 Test: blockdev write zeroes read split partial ...passed 00:07:40.559 Test: blockdev reset ...passed 00:07:40.559 Test: blockdev write read 8 blocks ...passed 00:07:40.559 Test: blockdev write read size > 128k ...passed 00:07:40.559 Test: blockdev write read invalid size ...passed 00:07:40.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.559 Test: blockdev write read max offset ...passed 00:07:40.559 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.559 Test: blockdev writev readv 8 blocks ...passed 00:07:40.559 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.559 Test: blockdev writev readv block ...passed 00:07:40.559 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p6 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor 
specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p5 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p4 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p3 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: 
blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p2 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p1 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 
Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc2p0 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.560 Test: blockdev writev readv block ...passed 00:07:40.560 Test: blockdev writev readv size > 128k ...passed 00:07:40.560 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.560 Test: blockdev comparev and writev ...passed 00:07:40.560 Test: blockdev nvme passthru rw ...passed 00:07:40.560 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.560 Test: blockdev nvme admin passthru ...passed 00:07:40.560 Test: blockdev copy ...passed 00:07:40.560 Suite: bdevio tests on: Malloc1p1 00:07:40.560 Test: blockdev write read block ...passed 00:07:40.560 Test: blockdev write zeroes read block ...passed 00:07:40.560 Test: blockdev write zeroes read no split ...passed 00:07:40.560 Test: blockdev write zeroes read split ...passed 00:07:40.560 Test: blockdev write zeroes read split partial ...passed 00:07:40.560 Test: blockdev reset ...passed 00:07:40.560 Test: blockdev write read 8 blocks ...passed 00:07:40.560 Test: blockdev write read size > 128k ...passed 00:07:40.560 Test: blockdev write read invalid size ...passed 00:07:40.560 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.560 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.560 Test: blockdev write read max offset ...passed 00:07:40.560 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.560 Test: blockdev writev readv 8 blocks ...passed 00:07:40.560 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.561 Test: blockdev writev readv block ...passed 00:07:40.561 Test: blockdev writev readv size > 128k ...passed 00:07:40.561 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.561 Test: blockdev comparev and writev ...passed 00:07:40.561 Test: blockdev nvme passthru rw ...passed 00:07:40.561 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.561 Test: blockdev nvme admin passthru ...passed 00:07:40.561 Test: blockdev copy ...passed 00:07:40.561 Suite: 
bdevio tests on: Malloc1p0 00:07:40.561 Test: blockdev write read block ...passed 00:07:40.561 Test: blockdev write zeroes read block ...passed 00:07:40.561 Test: blockdev write zeroes read no split ...passed 00:07:40.561 Test: blockdev write zeroes read split ...passed 00:07:40.561 Test: blockdev write zeroes read split partial ...passed 00:07:40.561 Test: blockdev reset ...passed 00:07:40.561 Test: blockdev write read 8 blocks ...passed 00:07:40.561 Test: blockdev write read size > 128k ...passed 00:07:40.561 Test: blockdev write read invalid size ...passed 00:07:40.561 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.561 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.561 Test: blockdev write read max offset ...passed 00:07:40.561 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.561 Test: blockdev writev readv 8 blocks ...passed 00:07:40.561 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.561 Test: blockdev writev readv block ...passed 00:07:40.561 Test: blockdev writev readv size > 128k ...passed 00:07:40.561 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.561 Test: blockdev comparev and writev ...passed 00:07:40.561 Test: blockdev nvme passthru rw ...passed 00:07:40.561 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.561 Test: blockdev nvme admin passthru ...passed 00:07:40.561 Test: blockdev copy ...passed 00:07:40.561 Suite: bdevio tests on: Malloc0 00:07:40.561 Test: blockdev write read block ...passed 00:07:40.561 Test: blockdev write zeroes read block ...passed 00:07:40.561 Test: blockdev write zeroes read no split ...passed 00:07:40.561 Test: blockdev write zeroes read split ...passed 00:07:40.561 Test: blockdev write zeroes read split partial ...passed 00:07:40.561 Test: blockdev reset ...passed 00:07:40.561 Test: blockdev write read 8 blocks ...passed 00:07:40.561 Test: blockdev write read size > 128k ...passed 00:07:40.561 Test: blockdev write read invalid size ...passed 00:07:40.561 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:40.561 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:40.561 Test: blockdev write read max offset ...passed 00:07:40.561 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:40.561 Test: blockdev writev readv 8 blocks ...passed 00:07:40.561 Test: blockdev writev readv 30 x 1block ...passed 00:07:40.561 Test: blockdev writev readv block ...passed 00:07:40.561 Test: blockdev writev readv size > 128k ...passed 00:07:40.561 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:40.561 Test: blockdev comparev and writev ...passed 00:07:40.561 Test: blockdev nvme passthru rw ...passed 00:07:40.561 Test: blockdev nvme passthru vendor specific ...passed 00:07:40.561 Test: blockdev nvme admin passthru ...passed 00:07:40.561 Test: blockdev copy ...passed 00:07:40.561 00:07:40.561 Run Summary: Type Total Ran Passed Failed Inactive 00:07:40.561 suites 16 16 n/a 0 0 00:07:40.561 tests 368 368 368 0 0 00:07:40.561 asserts 2224 2224 2224 0 n/a 00:07:40.561 00:07:40.561 Elapsed time = 0.472 seconds 00:07:40.818 0 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 4148481 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 4148481 ']' 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 4148481 
00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4148481 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4148481' 00:07:40.818 killing process with pid 4148481 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 4148481 00:07:40.818 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 4148481 00:07:41.076 13:30:28 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:07:41.076 00:07:41.076 real 0m1.520s 00:07:41.076 user 0m3.736s 00:07:41.076 sys 0m0.425s 00:07:41.076 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.076 13:30:28 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:41.076 ************************************ 00:07:41.076 END TEST bdev_bounds 00:07:41.076 ************************************ 00:07:41.076 13:30:28 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:07:41.077 13:30:28 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:41.077 13:30:28 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:07:41.077 13:30:28 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.077 13:30:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:41.077 ************************************ 00:07:41.077 START TEST bdev_nbd 00:07:41.077 ************************************ 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=4148695 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 4148695 /var/tmp/spdk-nbd.sock 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 4148695 ']' 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:41.077 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:41.077 13:30:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:41.077 [2024-07-15 13:30:28.684433] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
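From here the bdev_nbd test maps the same 16 bdevs onto kernel NBD nodes: it confirms /sys/module/nbd is present, starts a bdev_svc app listening on the /var/tmp/spdk-nbd.sock RPC socket with the bdev.json config, waits for the socket, then attaches each bdev with nbd_start_disk. A condensed bash sketch of that setup flow, assuming the paths shown in the trace and using a simplified stand-in for waitforlisten (this is not the exact nbd_function_test code):

    #!/usr/bin/env bash
    # Illustrative sketch of the bdev_nbd setup flow seen in the trace above.
    set -e
    rpc_sock=/var/tmp/spdk-nbd.sock
    conf=test/bdev/bdev.json                      # bdev config, path as in the trace
    bdevs=(Malloc0 Malloc1p0 Malloc1p1 TestPT raid0 concat0 raid1 AIO0)   # subset for brevity

    [[ -e /sys/module/nbd ]] || modprobe nbd      # the test requires NBD support in the kernel

    # start the bdev service on its own RPC socket
    test/app/bdev_svc/bdev_svc -r "$rpc_sock" -i 0 --json "$conf" &
    svc_pid=$!
    trap 'kill "$svc_pid"' EXIT

    # simplified stand-in for waitforlisten: poll until the RPC socket answers
    until scripts/rpc.py -s "$rpc_sock" -t 1 rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done

    # map each bdev to a kernel NBD node; nbd_start_disk prints the device it picked
    for b in "${bdevs[@]}"; do
        dev=$(scripts/rpc.py -s "$rpc_sock" nbd_start_disk "$b")
        echo "mapped $b -> $dev"
    done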
00:07:41.077 [2024-07-15 13:30:28.684491] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:41.335 [2024-07-15 13:30:28.778948] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.335 [2024-07-15 13:30:28.872509] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.593 [2024-07-15 13:30:29.027386] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:41.593 [2024-07-15 13:30:29.027430] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:41.593 [2024-07-15 13:30:29.027440] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:41.593 [2024-07-15 13:30:29.035395] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:41.593 [2024-07-15 13:30:29.035412] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:41.593 [2024-07-15 13:30:29.043411] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:41.593 [2024-07-15 13:30:29.043427] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:41.593 [2024-07-15 13:30:29.111896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:41.593 [2024-07-15 13:30:29.111937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:41.593 [2024-07-15 13:30:29.111950] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21f8ff0 00:07:41.593 [2024-07-15 13:30:29.111974] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:41.593 [2024-07-15 13:30:29.113054] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:41.593 [2024-07-15 13:30:29.113080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:42.156 1+0 records in 00:07:42.156 1+0 records out 00:07:42.156 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261411 s, 15.7 MB/s 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.156 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:42.480 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:42.480 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:42.481 13:30:29 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:42.481 1+0 records in 00:07:42.481 1+0 records out 00:07:42.481 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00018762 s, 21.8 MB/s 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.481 13:30:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:42.761 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:42.761 1+0 records in 00:07:42.761 1+0 records out 00:07:42.761 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188 s, 21.8 MB/s 00:07:42.761 13:30:30 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:42.762 1+0 records in 00:07:42.762 1+0 records out 00:07:42.762 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289074 s, 14.2 MB/s 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.762 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.018 1+0 records in 00:07:43.018 1+0 records out 00:07:43.018 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314316 s, 13.0 MB/s 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.018 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.019 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.019 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.019 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.019 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.019 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.019 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:07:43.275 1+0 records in 00:07:43.275 1+0 records out 00:07:43.275 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280984 s, 14.6 MB/s 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.275 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.532 13:30:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.532 1+0 records in 00:07:43.532 1+0 records out 00:07:43.532 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000356029 s, 11.5 MB/s 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.532 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
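Each nbd_start_disk call above is followed by the same readiness probe: poll /proc/partitions until the new nbdX entry appears, read one 4 KiB block through O_DIRECT with dd, and confirm the full 4096 bytes came back before moving on to the next device. A compact bash sketch of that probe, mirroring the trace (retry counts and the scratch-file path here are illustrative, not the exact common/autotest_common.sh implementation):

    # Illustrative readiness probe for a freshly mapped /dev/nbdX.
    waitfornbd() {
        local nbd_name=$1 scratch=/tmp/nbdtest i rc
        # wait (up to ~2 s) for the kernel to publish the device in /proc/partitions
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            [ "$i" -eq 20 ] && return 1
            sleep 0.1
        done
        # read a single 4 KiB block with O_DIRECT and check that all 4096 bytes arrived
        dd if=/dev/"$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct || return 1
        [ "$(stat -c %s "$scratch")" -eq 4096 ]
        rc=$?
        rm -f "$scratch"
        return $rc
    }

    # usage: waitfornbd nbd0 && echo "/dev/nbd0 is ready"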
00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.789 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.790 1+0 records in 00:07:43.790 1+0 records out 00:07:43.790 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000379528 s, 10.8 MB/s 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.790 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.047 13:30:31 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.047 1+0 records in 00:07:44.047 1+0 records out 00:07:44.047 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000377432 s, 10.9 MB/s 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:44.047 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.304 1+0 records in 00:07:44.304 1+0 records out 00:07:44.304 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00045416 s, 9.0 MB/s 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.304 13:30:31 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.304 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.305 1+0 records in 00:07:44.305 1+0 records out 00:07:44.305 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000474777 s, 8.6 MB/s 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.305 13:30:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.562 1+0 records in 00:07:44.562 1+0 records out 00:07:44.562 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000479549 s, 8.5 MB/s 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.562 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.820 1+0 records in 00:07:44.820 1+0 records out 00:07:44.820 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000398889 s, 10.3 MB/s 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.820 13:30:32 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.820 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.077 1+0 records in 00:07:45.077 1+0 records out 00:07:45.077 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000514786 s, 8.0 MB/s 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.077 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.335 1+0 records in 00:07:45.335 1+0 records out 00:07:45.335 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000585431 s, 7.0 MB/s 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.335 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.336 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.336 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.336 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.593 1+0 records in 00:07:45.593 1+0 records out 00:07:45.593 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000520841 s, 7.9 MB/s 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.593 13:30:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd0", 00:07:45.593 "bdev_name": "Malloc0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd1", 00:07:45.593 "bdev_name": "Malloc1p0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd2", 00:07:45.593 "bdev_name": "Malloc1p1" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd3", 00:07:45.593 "bdev_name": "Malloc2p0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd4", 00:07:45.593 "bdev_name": "Malloc2p1" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd5", 00:07:45.593 "bdev_name": "Malloc2p2" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd6", 00:07:45.593 "bdev_name": "Malloc2p3" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd7", 00:07:45.593 "bdev_name": "Malloc2p4" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd8", 00:07:45.593 "bdev_name": "Malloc2p5" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd9", 00:07:45.593 "bdev_name": "Malloc2p6" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd10", 00:07:45.593 "bdev_name": "Malloc2p7" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd11", 00:07:45.593 "bdev_name": "TestPT" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd12", 00:07:45.593 "bdev_name": "raid0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd13", 00:07:45.593 "bdev_name": "concat0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd14", 00:07:45.593 "bdev_name": "raid1" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd15", 00:07:45.593 "bdev_name": "AIO0" 00:07:45.593 } 00:07:45.593 ]' 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd0", 00:07:45.593 "bdev_name": "Malloc0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd1", 00:07:45.593 "bdev_name": "Malloc1p0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd2", 00:07:45.593 "bdev_name": "Malloc1p1" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd3", 00:07:45.593 "bdev_name": "Malloc2p0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd4", 00:07:45.593 "bdev_name": "Malloc2p1" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd5", 00:07:45.593 "bdev_name": "Malloc2p2" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd6", 00:07:45.593 "bdev_name": "Malloc2p3" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd7", 00:07:45.593 "bdev_name": "Malloc2p4" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd8", 00:07:45.593 "bdev_name": "Malloc2p5" 
00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd9", 00:07:45.593 "bdev_name": "Malloc2p6" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd10", 00:07:45.593 "bdev_name": "Malloc2p7" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd11", 00:07:45.593 "bdev_name": "TestPT" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd12", 00:07:45.593 "bdev_name": "raid0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd13", 00:07:45.593 "bdev_name": "concat0" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd14", 00:07:45.593 "bdev_name": "raid1" 00:07:45.593 }, 00:07:45.593 { 00:07:45.593 "nbd_device": "/dev/nbd15", 00:07:45.593 "bdev_name": "AIO0" 00:07:45.593 } 00:07:45.593 ]' 00:07:45.593 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:45.851 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.107 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.363 13:30:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:07:46.620 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.877 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.878 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.135 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.136 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.136 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.393 13:30:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.651 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.909 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:48.166 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:48.166 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:48.166 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:48.166 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.166 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.166 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:48.167 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.167 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.167 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.167 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.424 13:30:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.424 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.682 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:48.940 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:49.198 /dev/nbd0 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.198 1+0 records in 00:07:49.198 1+0 records out 00:07:49.198 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025111 s, 16.3 MB/s 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.198 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:07:49.456 /dev/nbd1 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.456 1+0 records in 00:07:49.456 1+0 records out 00:07:49.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00020915 s, 19.6 MB/s 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.456 13:30:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:07:49.456 /dev/nbd10 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.714 1+0 records in 00:07:49.714 1+0 records out 00:07:49.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277051 s, 14.8 MB/s 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:07:49.714 /dev/nbd11 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.714 1+0 records in 00:07:49.714 1+0 records out 00:07:49.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258878 s, 15.8 MB/s 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.714 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:07:49.972 /dev/nbd12 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.972 1+0 records in 00:07:49.972 1+0 records out 00:07:49.972 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313355 s, 13.1 MB/s 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.972 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:07:50.229 /dev/nbd13 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.229 13:30:37 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.229 1+0 records in 00:07:50.229 1+0 records out 00:07:50.229 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000380183 s, 10.8 MB/s 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.229 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:07:50.487 /dev/nbd14 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.487 1+0 records in 00:07:50.487 1+0 records out 00:07:50.487 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218689 s, 18.7 MB/s 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.487 13:30:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:07:50.487 /dev/nbd15 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.745 1+0 records in 00:07:50.745 1+0 records out 00:07:50.745 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359349 s, 11.4 MB/s 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:07:50.745 /dev/nbd2 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.745 13:30:38 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.745 1+0 records in 00:07:50.745 1+0 records out 00:07:50.745 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359261 s, 11.4 MB/s 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.745 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:07:51.002 /dev/nbd3 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.002 1+0 records in 00:07:51.002 1+0 records out 00:07:51.002 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000469823 s, 8.7 MB/s 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.002 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:07:51.259 /dev/nbd4 00:07:51.259 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:07:51.259 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.260 1+0 records in 00:07:51.260 1+0 records out 00:07:51.260 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000420915 s, 9.7 MB/s 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.260 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:07:51.517 /dev/nbd5 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.518 13:30:38 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.518 1+0 records in 00:07:51.518 1+0 records out 00:07:51.518 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00052081 s, 7.9 MB/s 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.518 13:30:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.518 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.518 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.518 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:07:51.776 /dev/nbd6 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.776 1+0 records in 00:07:51.776 1+0 records out 00:07:51.776 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000578895 s, 7.1 MB/s 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.776 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:07:51.776 /dev/nbd7 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:52.035 1+0 records in 00:07:52.035 1+0 records out 00:07:52.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000442994 s, 9.2 MB/s 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:07:52.035 /dev/nbd8 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:52.035 1+0 records in 00:07:52.035 1+0 records out 00:07:52.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000580994 s, 7.0 MB/s 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:52.035 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:07:52.293 /dev/nbd9 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:52.293 1+0 records in 00:07:52.293 1+0 records out 00:07:52.293 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000521123 s, 7.9 MB/s 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
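[Editor's note] The waitfornbd helper traced repeatedly above (common/autotest_common.sh, lines ~866-887 of the xtrace) can be read more easily as a sketch. This is a reconstruction inferred from the trace only, not the verbatim SPDK source; the retry sleeps and the temp-file path variable are assumptions, since only the successful first iteration of each loop is visible in the log.

```bash
# Sketch of waitfornbd as reconstructed from the xtrace above (not verbatim source).
waitfornbd() {
    local nbd_name=$1
    local i size

    # Wait (up to 20 attempts) for the device to show up in /proc/partitions.
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" /proc/partitions; then
            break
        fi
        sleep 0.1 # assumed retry delay; not visible in the trace
    done

    # Retry a single 4 KiB direct-I/O read to confirm the device is actually usable.
    for ((i = 1; i <= 20; i++)); do
        if dd if=/dev/$nbd_name of="$testdir/nbdtest" bs=4096 count=1 iflag=direct; then
            break
        fi
        sleep 0.1 # assumed retry delay
    done

    # The read must have produced a non-empty file (the trace shows size=4096).
    size=$(stat -c %s "$testdir/nbdtest")
    rm -f "$testdir/nbdtest"
    [ "$size" != 0 ]
}
```

In the log above, $testdir/nbdtest corresponds to the Jenkins workspace path /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest.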
00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.293 13:30:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd0", 00:07:52.551 "bdev_name": "Malloc0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd1", 00:07:52.551 "bdev_name": "Malloc1p0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd10", 00:07:52.551 "bdev_name": "Malloc1p1" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd11", 00:07:52.551 "bdev_name": "Malloc2p0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd12", 00:07:52.551 "bdev_name": "Malloc2p1" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd13", 00:07:52.551 "bdev_name": "Malloc2p2" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd14", 00:07:52.551 "bdev_name": "Malloc2p3" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd15", 00:07:52.551 "bdev_name": "Malloc2p4" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd2", 00:07:52.551 "bdev_name": "Malloc2p5" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd3", 00:07:52.551 "bdev_name": "Malloc2p6" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd4", 00:07:52.551 "bdev_name": "Malloc2p7" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd5", 00:07:52.551 "bdev_name": "TestPT" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd6", 00:07:52.551 "bdev_name": "raid0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd7", 00:07:52.551 "bdev_name": "concat0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd8", 00:07:52.551 "bdev_name": "raid1" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd9", 00:07:52.551 "bdev_name": "AIO0" 00:07:52.551 } 00:07:52.551 ]' 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd0", 00:07:52.551 "bdev_name": "Malloc0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd1", 00:07:52.551 "bdev_name": "Malloc1p0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd10", 00:07:52.551 "bdev_name": "Malloc1p1" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd11", 00:07:52.551 "bdev_name": "Malloc2p0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd12", 00:07:52.551 "bdev_name": "Malloc2p1" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd13", 00:07:52.551 "bdev_name": "Malloc2p2" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd14", 00:07:52.551 
"bdev_name": "Malloc2p3" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd15", 00:07:52.551 "bdev_name": "Malloc2p4" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd2", 00:07:52.551 "bdev_name": "Malloc2p5" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd3", 00:07:52.551 "bdev_name": "Malloc2p6" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd4", 00:07:52.551 "bdev_name": "Malloc2p7" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd5", 00:07:52.551 "bdev_name": "TestPT" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd6", 00:07:52.551 "bdev_name": "raid0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd7", 00:07:52.551 "bdev_name": "concat0" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd8", 00:07:52.551 "bdev_name": "raid1" 00:07:52.551 }, 00:07:52.551 { 00:07:52.551 "nbd_device": "/dev/nbd9", 00:07:52.551 "bdev_name": "AIO0" 00:07:52.551 } 00:07:52.551 ]' 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:52.551 /dev/nbd1 00:07:52.551 /dev/nbd10 00:07:52.551 /dev/nbd11 00:07:52.551 /dev/nbd12 00:07:52.551 /dev/nbd13 00:07:52.551 /dev/nbd14 00:07:52.551 /dev/nbd15 00:07:52.551 /dev/nbd2 00:07:52.551 /dev/nbd3 00:07:52.551 /dev/nbd4 00:07:52.551 /dev/nbd5 00:07:52.551 /dev/nbd6 00:07:52.551 /dev/nbd7 00:07:52.551 /dev/nbd8 00:07:52.551 /dev/nbd9' 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:52.551 /dev/nbd1 00:07:52.551 /dev/nbd10 00:07:52.551 /dev/nbd11 00:07:52.551 /dev/nbd12 00:07:52.551 /dev/nbd13 00:07:52.551 /dev/nbd14 00:07:52.551 /dev/nbd15 00:07:52.551 /dev/nbd2 00:07:52.551 /dev/nbd3 00:07:52.551 /dev/nbd4 00:07:52.551 /dev/nbd5 00:07:52.551 /dev/nbd6 00:07:52.551 /dev/nbd7 00:07:52.551 /dev/nbd8 00:07:52.551 /dev/nbd9' 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:52.551 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:52.552 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:52.552 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:52.552 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:52.552 256+0 records in 00:07:52.552 256+0 records out 00:07:52.552 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.01145 s, 91.6 MB/s 00:07:52.552 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:52.552 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:52.809 256+0 records in 00:07:52.809 256+0 records out 00:07:52.809 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0935707 s, 11.2 MB/s 00:07:52.809 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:52.809 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:52.809 256+0 records in 00:07:52.809 256+0 records out 00:07:52.809 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.149679 s, 7.0 MB/s 00:07:52.809 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:52.809 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:53.067 256+0 records in 00:07:53.067 256+0 records out 00:07:53.067 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121615 s, 8.6 MB/s 00:07:53.067 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.067 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:53.067 256+0 records in 00:07:53.067 256+0 records out 00:07:53.067 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121297 s, 8.6 MB/s 00:07:53.067 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.067 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:53.324 256+0 records in 00:07:53.324 256+0 records out 00:07:53.324 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120922 s, 8.7 MB/s 00:07:53.324 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.324 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:53.324 256+0 records in 00:07:53.324 256+0 records out 00:07:53.324 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121474 s, 8.6 MB/s 00:07:53.324 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.324 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:53.580 256+0 records in 00:07:53.580 256+0 records out 00:07:53.580 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121716 s, 8.6 MB/s 00:07:53.581 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.581 13:30:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:07:53.581 256+0 records in 00:07:53.581 256+0 records out 00:07:53.581 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.116015 s, 9.0 MB/s 00:07:53.581 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.581 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:07:53.838 256+0 records in 00:07:53.838 256+0 records out 00:07:53.838 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120991 s, 8.7 MB/s 00:07:53.838 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.838 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:07:53.838 256+0 records in 00:07:53.838 256+0 records out 00:07:53.838 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121023 s, 8.7 MB/s 00:07:53.838 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.838 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:07:54.095 256+0 records in 00:07:54.095 256+0 records out 00:07:54.095 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122305 s, 8.6 MB/s 00:07:54.095 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:54.095 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:07:54.095 256+0 records in 00:07:54.095 256+0 records out 00:07:54.095 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.123205 s, 8.5 MB/s 00:07:54.095 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:54.095 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:07:54.352 256+0 records in 00:07:54.352 256+0 records out 00:07:54.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122563 s, 8.6 MB/s 00:07:54.352 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:54.352 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:07:54.352 256+0 records in 00:07:54.352 256+0 records out 00:07:54.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122669 s, 8.5 MB/s 00:07:54.352 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:54.352 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:07:54.610 256+0 records in 00:07:54.610 256+0 records out 00:07:54.610 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.123918 s, 8.5 MB/s 00:07:54.610 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:07:54.610 
256+0 records in 00:07:54.610 256+0 records out 00:07:54.610 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120446 s, 8.7 MB/s 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 
00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:07:54.610 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:54.868 13:30:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.868 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.125 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.381 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.382 13:30:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.382 13:30:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.639 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.896 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.154 
13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.154 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.411 13:30:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:56.668 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:56.668 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:56.668 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.669 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:56.926 
13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.926 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:57.183 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:57.440 
13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:57.440 13:30:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:57.697 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:57.955 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:58.212 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@104 -- # count=0 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:58.213 malloc_lvol_verify 00:07:58.213 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:58.470 1319c65d-bcdf-44f6-a6a0-fdaee3b05b9b 00:07:58.470 13:30:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:58.728 be42f209-70a9-40b6-9811-ecf81429dc5c 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:58.728 /dev/nbd0 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:58.728 mke2fs 1.46.5 (30-Dec-2021) 00:07:58.728 Discarding device blocks: 0/4096 done 00:07:58.728 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:58.728 00:07:58.728 Allocating group tables: 0/1 done 00:07:58.728 Writing inode tables: 0/1 done 00:07:58.728 Creating journal (1024 blocks): done 00:07:58.728 Writing superblocks and filesystem accounting information: 0/1 done 00:07:58.728 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:58.728 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:58.986 13:30:46 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 4148695 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 4148695 ']' 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 4148695 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4148695 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4148695' 00:07:58.986 killing process with pid 4148695 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 4148695 00:07:58.986 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 4148695 00:07:59.554 13:30:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:07:59.554 00:07:59.554 real 0m18.270s 00:07:59.554 user 0m21.728s 00:07:59.554 sys 0m10.717s 00:07:59.554 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.554 13:30:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:59.554 ************************************ 00:07:59.554 END TEST bdev_nbd 00:07:59.554 ************************************ 00:07:59.554 13:30:46 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:07:59.554 13:30:46 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:07:59.554 13:30:46 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:07:59.554 13:30:46 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:07:59.554 13:30:46 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:07:59.554 13:30:46 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:59.554 13:30:46 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.554 13:30:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:59.554 ************************************ 00:07:59.554 START TEST bdev_fio 00:07:59.554 ************************************ 00:07:59.554 13:30:46 blockdev_general.bdev_fio 
-- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:59.554 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:07:59.554 13:30:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.554 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@343 -- # echo filename=concat0 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.555 13:30:47 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:59.555 ************************************ 00:07:59.555 START TEST bdev_fio_rw_verify 00:07:59.555 ************************************ 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 
-- # local asan_lib= 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:07:59.555 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:59.813 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:59.813 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:59.813 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:59.813 13:30:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:00.071 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:00.071 fio-3.35 00:08:00.071 Starting 16 threads 00:08:12.340 00:08:12.340 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=4152005: Mon Jul 15 13:30:58 2024 00:08:12.340 read: IOPS=108k, BW=421MiB/s (441MB/s)(4209MiB/10001msec) 00:08:12.340 slat (nsec): min=1916, max=241661, avg=29278.03, stdev=12523.10 00:08:12.340 clat (usec): min=8, max=887, avg=249.59, stdev=117.64 00:08:12.340 lat (usec): min=17, max=935, avg=278.87, stdev=123.79 00:08:12.340 clat percentiles (usec): 00:08:12.340 | 50.000th=[ 243], 99.000th=[ 506], 99.900th=[ 586], 99.990th=[ 685], 00:08:12.340 | 99.999th=[ 783] 00:08:12.340 write: IOPS=169k, BW=659MiB/s (691MB/s)(6511MiB/9875msec); 0 zone resets 00:08:12.340 slat (usec): min=4, max=3166, avg=40.01, stdev=12.65 00:08:12.340 clat (usec): min=10, max=3460, avg=291.78, stdev=131.30 00:08:12.340 lat (usec): min=26, max=3489, avg=331.79, stdev=137.13 00:08:12.340 clat percentiles (usec): 00:08:12.340 | 50.000th=[ 281], 99.000th=[ 594], 99.900th=[ 750], 99.990th=[ 840], 00:08:12.340 | 99.999th=[ 938] 00:08:12.340 bw ( KiB/s): min=566688, max=894118, per=99.12%, avg=669256.00, stdev=5571.59, samples=304 00:08:12.340 iops : min=141672, max=223527, avg=167313.79, stdev=1392.86, samples=304 00:08:12.340 lat (usec) : 10=0.01%, 20=0.04%, 50=1.11%, 100=6.53%, 250=38.50% 00:08:12.340 lat (usec) : 500=49.44%, 750=4.32%, 1000=0.06% 00:08:12.340 lat (msec) : 2=0.01%, 4=0.01% 00:08:12.340 cpu : usr=99.19%, sys=0.41%, ctx=687, majf=0, minf=2514 00:08:12.340 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:12.340 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:12.340 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:12.340 issued rwts: total=1077432,1666882,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:12.340 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:12.340 00:08:12.340 Run status group 0 (all jobs): 00:08:12.340 READ: bw=421MiB/s (441MB/s), 421MiB/s-421MiB/s (441MB/s-441MB/s), io=4209MiB (4413MB), run=10001-10001msec 00:08:12.340 WRITE: bw=659MiB/s (691MB/s), 659MiB/s-659MiB/s (691MB/s-691MB/s), io=6511MiB (6828MB), run=9875-9875msec 00:08:12.340 00:08:12.340 real 0m11.610s 00:08:12.340 user 2m44.395s 00:08:12.340 sys 0m1.332s 00:08:12.340 13:30:58 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.340 13:30:58 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:08:12.340 ************************************ 00:08:12.340 END TEST bdev_fio_rw_verify 00:08:12.340 ************************************ 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # 
return 0 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:12.340 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:12.341 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:12.341 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:08:12.341 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:08:12.341 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:08:12.341 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:12.342 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "48aa72e9-5f15-4c80-a523-7f7dfaf32119"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "48aa72e9-5f15-4c80-a523-7f7dfaf32119",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e21b7357-ffe2-5fdc-b094-4d118ac5ad60"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e21b7357-ffe2-5fdc-b094-4d118ac5ad60",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "35f3a3da-62c6-566e-b025-6d3e8a7d1154"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "35f3a3da-62c6-566e-b025-6d3e8a7d1154",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "363685c1-16d0-51be-a5e2-2b7e8208775a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "363685c1-16d0-51be-a5e2-2b7e8208775a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "b3a2c8bd-5209-545c-95a7-8acf29cd4c78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b3a2c8bd-5209-545c-95a7-8acf29cd4c78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b0e2e76d-7c52-5ddb-8294-12ef7ae1b6f5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b0e2e76d-7c52-5ddb-8294-12ef7ae1b6f5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "eb774bb3-c666-5cf2-b8e1-f22fa2700420"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "eb774bb3-c666-5cf2-b8e1-f22fa2700420",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "67b72e79-c0f8-5115-a477-c17b8f81a4e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67b72e79-c0f8-5115-a477-c17b8f81a4e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "c91e2198-f4d6-5eac-a830-916aca99aff2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c91e2198-f4d6-5eac-a830-916aca99aff2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' 
"nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c80adbba-b879-55e2-ac1e-429ce972f83e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c80adbba-b879-55e2-ac1e-429ce972f83e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e265e266-c67d-581e-a123-a6d59d2e4feb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e265e266-c67d-581e-a123-a6d59d2e4feb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "5209a344-f469-546a-aa61-d17b62e961e8"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5209a344-f469-546a-aa61-d17b62e961e8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' 
"4c5cebaa-e3ef-4f89-8387-dfd86af2a668"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4c5cebaa-e3ef-4f89-8387-dfd86af2a668",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4c5cebaa-e3ef-4f89-8387-dfd86af2a668",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5547bc2d-e462-4cc2-855e-12666afbfc68",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "579f7b31-181b-4f86-b409-9aff863d20e0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "6531a738-0867-45d8-8c5a-29898c4ce58d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "6531a738-0867-45d8-8c5a-29898c4ce58d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "6531a738-0867-45d8-8c5a-29898c4ce58d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "5fb74f07-b8a8-4982-80a5-1caa213f221c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "9b41b728-fd4e-44ae-9f3f-cb45a43c6e5f",' ' "is_configured": true,' ' "data_offset": 0,' ' 
"data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "8c8c3bf6-3c02-450b-96b5-57fa82f54c4f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "a0d722c7-a440-424c-8ac0-5e285abb118d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "f5c49728-c43b-4df9-835d-d6301e7bccc4"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "f5c49728-c43b-4df9-835d-d6301e7bccc4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:12.342 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:08:12.342 Malloc1p0 00:08:12.342 Malloc1p1 00:08:12.342 Malloc2p0 00:08:12.342 Malloc2p1 00:08:12.342 Malloc2p2 00:08:12.342 Malloc2p3 00:08:12.342 Malloc2p4 00:08:12.342 Malloc2p5 00:08:12.342 Malloc2p6 00:08:12.342 Malloc2p7 00:08:12.342 TestPT 00:08:12.342 raid0 00:08:12.342 concat0 ]] 00:08:12.342 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' 
"name": "Malloc0",' ' "aliases": [' ' "48aa72e9-5f15-4c80-a523-7f7dfaf32119"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "48aa72e9-5f15-4c80-a523-7f7dfaf32119",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e21b7357-ffe2-5fdc-b094-4d118ac5ad60"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e21b7357-ffe2-5fdc-b094-4d118ac5ad60",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "35f3a3da-62c6-566e-b025-6d3e8a7d1154"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "35f3a3da-62c6-566e-b025-6d3e8a7d1154",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "363685c1-16d0-51be-a5e2-2b7e8208775a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "363685c1-16d0-51be-a5e2-2b7e8208775a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": 
false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "b3a2c8bd-5209-545c-95a7-8acf29cd4c78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b3a2c8bd-5209-545c-95a7-8acf29cd4c78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b0e2e76d-7c52-5ddb-8294-12ef7ae1b6f5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b0e2e76d-7c52-5ddb-8294-12ef7ae1b6f5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "eb774bb3-c666-5cf2-b8e1-f22fa2700420"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "eb774bb3-c666-5cf2-b8e1-f22fa2700420",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "67b72e79-c0f8-5115-a477-c17b8f81a4e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"67b72e79-c0f8-5115-a477-c17b8f81a4e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "c91e2198-f4d6-5eac-a830-916aca99aff2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c91e2198-f4d6-5eac-a830-916aca99aff2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c80adbba-b879-55e2-ac1e-429ce972f83e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c80adbba-b879-55e2-ac1e-429ce972f83e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e265e266-c67d-581e-a123-a6d59d2e4feb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e265e266-c67d-581e-a123-a6d59d2e4feb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "5209a344-f469-546a-aa61-d17b62e961e8"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5209a344-f469-546a-aa61-d17b62e961e8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "4c5cebaa-e3ef-4f89-8387-dfd86af2a668"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4c5cebaa-e3ef-4f89-8387-dfd86af2a668",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4c5cebaa-e3ef-4f89-8387-dfd86af2a668",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5547bc2d-e462-4cc2-855e-12666afbfc68",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "579f7b31-181b-4f86-b409-9aff863d20e0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "6531a738-0867-45d8-8c5a-29898c4ce58d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "6531a738-0867-45d8-8c5a-29898c4ce58d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "6531a738-0867-45d8-8c5a-29898c4ce58d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "5fb74f07-b8a8-4982-80a5-1caa213f221c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "9b41b728-fd4e-44ae-9f3f-cb45a43c6e5f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "25abd9c4-4a2f-40f5-b99a-33dcee3a1ccf",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "8c8c3bf6-3c02-450b-96b5-57fa82f54c4f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "a0d722c7-a440-424c-8ac0-5e285abb118d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "f5c49728-c43b-4df9-835d-d6301e7bccc4"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "f5c49728-c43b-4df9-835d-d6301e7bccc4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 
0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 
'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:08:12.343 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.344 13:30:58 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:12.344 ************************************ 00:08:12.344 START TEST bdev_fio_trim 00:08:12.344 ************************************ 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:12.344 13:30:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:12.344 13:30:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:12.344 13:30:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:12.344 13:30:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:12.344 13:30:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:12.344 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:12.344 fio-3.35 00:08:12.344 Starting 14 threads 00:08:24.533 00:08:24.533 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=4153711: Mon Jul 15 13:31:09 2024 00:08:24.533 write: IOPS=141k, BW=552MiB/s (578MB/s)(5517MiB/10001msec); 0 zone resets 00:08:24.533 slat (nsec): min=1934, max=165807, avg=34965.47, stdev=9909.16 00:08:24.533 clat (usec): min=23, max=4229, avg=249.33, stdev=86.06 00:08:24.533 lat (usec): min=34, max=4254, avg=284.30, stdev=89.57 00:08:24.533 clat percentiles (usec): 00:08:24.533 | 50.000th=[ 243], 99.000th=[ 429], 99.900th=[ 469], 99.990th=[ 594], 00:08:24.533 | 99.999th=[ 1012] 00:08:24.533 bw ( KiB/s): min=518112, max=791004, per=100.00%, avg=566566.11, stdev=4549.54, samples=266 00:08:24.533 iops : min=129528, max=197749, avg=141641.42, stdev=1137.36, samples=266 00:08:24.533 trim: IOPS=141k, BW=552MiB/s (578MB/s)(5517MiB/10001msec); 0 zone resets 00:08:24.533 slat (usec): min=3, max=216, avg=23.49, stdev= 6.34 00:08:24.533 clat (usec): min=3, max=4254, avg=282.58, stdev=91.37 00:08:24.533 lat (usec): min=12, max=4280, avg=306.07, stdev=94.04 00:08:24.533 clat percentiles (usec): 00:08:24.533 | 50.000th=[ 277], 99.000th=[ 469], 99.900th=[ 510], 99.990th=[ 635], 00:08:24.533 | 99.999th=[ 1123] 00:08:24.533 bw ( KiB/s): min=518112, max=791004, per=100.00%, avg=566566.11, stdev=4549.61, samples=266 00:08:24.533 iops : min=129528, max=197749, avg=141641.42, stdev=1137.38, samples=266 00:08:24.533 lat (usec) : 4=0.01%, 10=0.01%, 20=0.03%, 50=0.18%, 100=1.65% 00:08:24.533 lat 
(usec) : 250=44.96%, 500=53.08%, 750=0.10%, 1000=0.01% 00:08:24.533 lat (msec) : 2=0.01%, 10=0.01% 00:08:24.533 cpu : usr=99.64%, sys=0.01%, ctx=576, majf=0, minf=1033 00:08:24.533 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:24.533 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:24.533 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:24.533 issued rwts: total=0,1412326,1412330,0 short=0,0,0,0 dropped=0,0,0,0 00:08:24.533 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:24.533 00:08:24.533 Run status group 0 (all jobs): 00:08:24.533 WRITE: bw=552MiB/s (578MB/s), 552MiB/s-552MiB/s (578MB/s-578MB/s), io=5517MiB (5785MB), run=10001-10001msec 00:08:24.533 TRIM: bw=552MiB/s (578MB/s), 552MiB/s-552MiB/s (578MB/s-578MB/s), io=5517MiB (5785MB), run=10001-10001msec 00:08:24.533 00:08:24.533 real 0m11.383s 00:08:24.533 user 2m25.155s 00:08:24.533 sys 0m0.684s 00:08:24.533 13:31:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.533 13:31:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:08:24.533 ************************************ 00:08:24.533 END TEST bdev_fio_trim 00:08:24.533 ************************************ 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:08:24.533 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:08:24.533 00:08:24.533 real 0m23.371s 00:08:24.533 user 5m9.759s 00:08:24.533 sys 0m2.214s 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.533 13:31:10 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:24.533 ************************************ 00:08:24.533 END TEST bdev_fio 00:08:24.533 ************************************ 00:08:24.533 13:31:10 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:24.533 13:31:10 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:24.533 13:31:10 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:24.533 13:31:10 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:08:24.533 13:31:10 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:24.533 13:31:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:24.534 ************************************ 00:08:24.534 START TEST bdev_verify 00:08:24.534 ************************************ 00:08:24.534 13:31:10 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:24.534 [2024-07-15 13:31:10.492818] Starting SPDK v24.09-pre git 
sha1 9cede6267 / DPDK 24.03.0 initialization...
00:08:24.534 [2024-07-15 13:31:10.492867] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4155159 ]
00:08:24.534 [2024-07-15 13:31:10.579754] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:08:24.534 [2024-07-15 13:31:10.669453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:24.534 [2024-07-15 13:31:10.669454] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:24.534 [2024-07-15 13:31:10.815885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:24.534 [2024-07-15 13:31:10.815930] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:08:24.534 [2024-07-15 13:31:10.815944] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:08:24.534 [2024-07-15 13:31:10.823896] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:24.534 [2024-07-15 13:31:10.823914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:24.534 [2024-07-15 13:31:10.831919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:24.534 [2024-07-15 13:31:10.831935] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:24.534 [2024-07-15 13:31:10.900099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:24.534 [2024-07-15 13:31:10.900143] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:08:24.534 [2024-07-15 13:31:10.900156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13771a0
00:08:24.534 [2024-07-15 13:31:10.900180] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:08:24.534 [2024-07-15 13:31:10.901422] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:08:24.534 [2024-07-15 13:31:10.901445] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:08:24.534 Running I/O for 5 seconds...
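For reference, this verify stage is driven by SPDK's bdevperf example application rather than fio. A minimal sketch of an equivalent standalone invocation follows; it assumes you run it from the root of the SPDK tree checked out by this job and that the bdev.json produced by the earlier setup steps is still in place (the flag values are copied from the command logged above; the comments describing them are the editor's reading of bdevperf, not output from this run):

  # 5-second data-integrity pass over every bdev described in bdev.json:
  # the verify workload writes a pattern, reads it back and compares it.
  ./build/examples/bdevperf \
      --json test/bdev/bdev.json \  # bdev config generated by the preceding steps
      -q 128 \                      # outstanding IOs per job
      -o 4096 \                     # IO size in bytes
      -w verify \                   # write / read-back / compare workload
      -t 5 \                        # run time in seconds
      -C -m 0x3                     # run on both reactors in the 0x3 core mask, per the notices above

In the per-bdev table that follows, each bdev is reported twice (Core Mask 0x1 and Core Mask 0x2), one row per reactor handling it.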
00:08:28.719 00:08:28.719 Latency(us) 00:08:28.719 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:28.719 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.719 Verification LBA range: start 0x0 length 0x1000 00:08:28.720 Malloc0 : 5.12 1649.46 6.44 0.00 0.00 77492.11 445.22 173242.99 00:08:28.720 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x1000 length 0x1000 00:08:28.720 Malloc0 : 5.11 1626.71 6.35 0.00 0.00 78566.95 372.20 271717.95 00:08:28.720 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x800 00:08:28.720 Malloc1p0 : 5.16 843.47 3.29 0.00 0.00 151170.56 2607.19 165036.74 00:08:28.720 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x800 length 0x800 00:08:28.720 Malloc1p0 : 5.15 844.91 3.30 0.00 0.00 150927.14 2607.19 155006.89 00:08:28.720 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x800 00:08:28.720 Malloc1p1 : 5.16 843.24 3.29 0.00 0.00 150929.19 2621.44 162301.33 00:08:28.720 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x800 length 0x800 00:08:28.720 Malloc1p1 : 5.15 844.65 3.30 0.00 0.00 150673.30 2635.69 153183.28 00:08:28.720 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p0 : 5.16 843.00 3.29 0.00 0.00 150664.20 2721.17 159565.91 00:08:28.720 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p0 : 5.15 844.39 3.30 0.00 0.00 150416.08 2735.42 149536.06 00:08:28.720 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p1 : 5.16 842.77 3.29 0.00 0.00 150391.15 2678.43 155006.89 00:08:28.720 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p1 : 5.16 844.12 3.30 0.00 0.00 150145.61 2678.43 147712.45 00:08:28.720 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p2 : 5.17 842.53 3.29 0.00 0.00 150104.19 2635.69 152271.47 00:08:28.720 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p2 : 5.16 843.85 3.30 0.00 0.00 149869.90 2664.18 142241.61 00:08:28.720 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p3 : 5.17 842.22 3.29 0.00 0.00 149836.65 2635.69 148624.25 00:08:28.720 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p3 : 5.16 843.59 3.30 0.00 0.00 149606.87 2635.69 139506.20 00:08:28.720 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p4 : 5.17 841.75 3.29 0.00 0.00 149597.60 2564.45 
144977.03 00:08:28.720 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p4 : 5.16 843.37 3.29 0.00 0.00 149328.77 2564.45 136770.78 00:08:28.720 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p5 : 5.17 841.35 3.29 0.00 0.00 149376.47 2564.45 142241.61 00:08:28.720 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p5 : 5.16 843.14 3.29 0.00 0.00 149059.37 2550.21 133123.56 00:08:28.720 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p6 : 5.17 841.10 3.29 0.00 0.00 149114.96 2493.22 138594.39 00:08:28.720 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p6 : 5.16 842.90 3.29 0.00 0.00 148810.24 2507.46 129476.34 00:08:28.720 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x200 00:08:28.720 Malloc2p7 : 5.18 840.86 3.28 0.00 0.00 148856.58 2535.96 135858.98 00:08:28.720 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x200 length 0x200 00:08:28.720 Malloc2p7 : 5.16 842.67 3.29 0.00 0.00 148544.54 2521.71 126740.93 00:08:28.720 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x1000 00:08:28.720 TestPT : 5.19 838.94 3.28 0.00 0.00 148847.89 9402.99 136770.78 00:08:28.720 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x1000 length 0x1000 00:08:28.720 TestPT : 5.18 819.11 3.20 0.00 0.00 152117.34 11112.63 183272.85 00:08:28.720 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x2000 00:08:28.720 raid0 : 5.18 840.34 3.28 0.00 0.00 148236.63 2578.70 120358.29 00:08:28.720 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x2000 length 0x2000 00:08:28.720 raid0 : 5.17 842.20 3.29 0.00 0.00 147939.65 2607.19 109872.53 00:08:28.720 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x2000 00:08:28.720 concat0 : 5.18 840.10 3.28 0.00 0.00 147996.28 2535.96 117622.87 00:08:28.720 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x2000 length 0x2000 00:08:28.720 concat0 : 5.17 841.74 3.29 0.00 0.00 147722.56 2521.71 106225.31 00:08:28.720 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 length 0x1000 00:08:28.720 raid1 : 5.18 839.70 3.28 0.00 0.00 147739.77 3048.85 112607.94 00:08:28.720 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x1000 length 0x1000 00:08:28.720 raid1 : 5.19 863.92 3.37 0.00 0.00 143643.55 2008.82 110328.43 00:08:28.720 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x0 
length 0x4e2 00:08:28.720 AIO0 : 5.19 863.25 3.37 0.00 0.00 143399.24 690.98 115343.36 00:08:28.720 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:28.720 Verification LBA range: start 0x4e2 length 0x4e2 00:08:28.720 AIO0 : 5.19 863.74 3.37 0.00 0.00 143362.42 1168.25 114887.46 00:08:28.720 =================================================================================================================== 00:08:28.720 Total : 28589.08 111.68 0.00 0.00 140874.12 372.20 271717.95 00:08:29.287 00:08:29.287 real 0m6.272s 00:08:29.287 user 0m11.730s 00:08:29.287 sys 0m0.348s 00:08:29.287 13:31:16 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.287 13:31:16 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:08:29.287 ************************************ 00:08:29.287 END TEST bdev_verify 00:08:29.287 ************************************ 00:08:29.287 13:31:16 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:29.287 13:31:16 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:29.287 13:31:16 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:08:29.287 13:31:16 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.287 13:31:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:29.287 ************************************ 00:08:29.287 START TEST bdev_verify_big_io 00:08:29.287 ************************************ 00:08:29.287 13:31:16 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:29.287 [2024-07-15 13:31:16.857856] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
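For anyone cross-checking the bdev_verify table above: with the 4096-byte IO size stated in every job header, the MiB/s column is just IOPS scaled by the IO size, so the rows can be validated with a one-line awk calculation (a hypothetical helper, values taken from the Malloc0 row above):
# MiB/s = IOPS * io_size / 2^20; 4096-byte IOs per the job headers above
awk 'BEGIN { printf "%.2f MiB/s\n", 1649.46 * 4096 / 1048576 }'   # prints 6.44, matching the Malloc0 row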
00:08:29.287 [2024-07-15 13:31:16.857913] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4155889 ] 00:08:29.546 [2024-07-15 13:31:16.947381] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:29.546 [2024-07-15 13:31:17.034075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.546 [2024-07-15 13:31:17.034077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.804 [2024-07-15 13:31:17.175260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:29.804 [2024-07-15 13:31:17.175306] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:29.804 [2024-07-15 13:31:17.175316] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:29.804 [2024-07-15 13:31:17.183255] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:29.804 [2024-07-15 13:31:17.183273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:29.804 [2024-07-15 13:31:17.191274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:29.804 [2024-07-15 13:31:17.191290] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:29.804 [2024-07-15 13:31:17.259418] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:29.804 [2024-07-15 13:31:17.259465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:29.804 [2024-07-15 13:31:17.259482] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc91a0 00:08:29.804 [2024-07-15 13:31:17.259491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:29.804 [2024-07-15 13:31:17.260741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:29.804 [2024-07-15 13:31:17.260763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:29.804 [2024-07-15 13:31:17.412212] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.413090] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.414422] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.415271] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.416604] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.417450] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.418729] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.419978] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.420752] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:29.804 [2024-07-15 13:31:17.422004] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.422761] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.424004] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.424758] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.426028] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.426796] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.428038] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:08:30.063 [2024-07-15 13:31:17.449616] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:30.063 [2024-07-15 13:31:17.451449] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:30.063 Running I/O for 5 seconds... 00:08:36.639 00:08:36.639 Latency(us) 00:08:36.639 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:36.639 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x0 length 0x100 00:08:36.639 Malloc0 : 5.54 323.65 20.23 0.00 0.00 390021.10 580.56 1174405.12 00:08:36.639 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x100 length 0x100 00:08:36.639 Malloc0 : 5.50 279.28 17.46 0.00 0.00 452207.68 580.56 1400532.81 00:08:36.639 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x0 length 0x80 00:08:36.639 Malloc1p0 : 6.14 54.76 3.42 0.00 0.00 2163413.25 1716.76 3413798.73 00:08:36.639 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x80 length 0x80 00:08:36.639 Malloc1p0 : 5.74 147.64 9.23 0.00 0.00 816565.22 2535.96 1641249.39 00:08:36.639 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x0 length 0x80 00:08:36.639 Malloc1p1 : 6.14 54.75 3.42 0.00 0.00 2117208.56 1047.15 3311676.55 00:08:36.639 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x80 length 0x80 00:08:36.639 Malloc1p1 : 5.98 53.52 3.34 0.00 0.00 2194836.98 1097.02 3457565.38 00:08:36.639 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x0 length 0x20 00:08:36.639 Malloc2p0 : 5.78 44.32 2.77 0.00 0.00 669210.83 502.21 1276527.30 00:08:36.639 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x20 length 0x20 00:08:36.639 Malloc2p0 : 5.75 41.77 2.61 0.00 0.00 709716.58 498.64 1210877.33 00:08:36.639 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x0 length 0x20 00:08:36.639 Malloc2p1 : 5.78 44.31 2.77 0.00 0.00 665562.37 477.27 1254643.98 00:08:36.639 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x20 length 0x20 00:08:36.639 Malloc2p1 : 5.75 41.77 2.61 0.00 0.00 706254.71 509.33 1196288.45 00:08:36.639 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x0 length 0x20 00:08:36.639 Malloc2p2 : 5.78 44.31 2.77 0.00 0.00 661664.34 480.83 1240055.10 00:08:36.639 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.639 Verification LBA range: start 0x20 length 0x20 00:08:36.640 Malloc2p2 : 5.75 41.76 2.61 0.00 0.00 702234.59 512.89 1181699.56 00:08:36.640 Job: Malloc2p3 (Core Mask 0x1, workload: verify, 
depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x20 00:08:36.640 Malloc2p3 : 5.78 44.30 2.77 0.00 0.00 658169.13 495.08 1225466.21 00:08:36.640 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x20 length 0x20 00:08:36.640 Malloc2p3 : 5.75 41.75 2.61 0.00 0.00 698319.27 509.33 1167110.68 00:08:36.640 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x20 00:08:36.640 Malloc2p4 : 5.78 44.29 2.77 0.00 0.00 654348.58 487.96 1203582.89 00:08:36.640 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x20 length 0x20 00:08:36.640 Malloc2p4 : 5.75 41.75 2.61 0.00 0.00 694303.16 505.77 1152521.79 00:08:36.640 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x20 00:08:36.640 Malloc2p5 : 5.78 44.29 2.77 0.00 0.00 650550.94 480.83 1188994.00 00:08:36.640 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x20 length 0x20 00:08:36.640 Malloc2p5 : 5.75 41.74 2.61 0.00 0.00 690842.02 495.08 1137932.91 00:08:36.640 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x20 00:08:36.640 Malloc2p6 : 5.78 44.28 2.77 0.00 0.00 647149.78 601.93 1174405.12 00:08:36.640 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x20 length 0x20 00:08:36.640 Malloc2p6 : 5.75 41.73 2.61 0.00 0.00 686443.48 591.25 1116049.59 00:08:36.640 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x20 00:08:36.640 Malloc2p7 : 5.78 44.27 2.77 0.00 0.00 644089.50 651.80 1159816.24 00:08:36.640 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x20 length 0x20 00:08:36.640 Malloc2p7 : 5.75 41.73 2.61 0.00 0.00 682639.98 648.24 1101460.70 00:08:36.640 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x100 00:08:36.640 TestPT : 6.15 57.22 3.58 0.00 0.00 1918878.57 1061.40 3078254.41 00:08:36.640 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x100 length 0x100 00:08:36.640 TestPT : 6.08 52.83 3.30 0.00 0.00 2091439.01 57215.78 2990721.11 00:08:36.640 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x200 00:08:36.640 raid0 : 5.97 67.33 4.21 0.00 0.00 1630519.32 1267.98 2976132.23 00:08:36.640 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x200 length 0x200 00:08:36.640 raid0 : 6.11 60.25 3.77 0.00 0.00 1800686.36 1239.49 3122021.06 00:08:36.640 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x200 00:08:36.640 concat0 : 6.14 72.98 4.56 0.00 0.00 1455490.79 1132.63 2859421.16 00:08:36.640 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x200 length 0x200 00:08:36.640 concat0 : 5.98 77.55 4.85 0.00 0.00 1396899.46 
1175.37 3019898.88 00:08:36.640 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x100 00:08:36.640 raid1 : 6.15 91.82 5.74 0.00 0.00 1154749.90 1424.70 2757298.98 00:08:36.640 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x100 length 0x100 00:08:36.640 raid1 : 6.08 76.33 4.77 0.00 0.00 1387426.24 1481.68 2917776.70 00:08:36.640 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x0 length 0x4e 00:08:36.640 AIO0 : 6.16 77.33 4.83 0.00 0.00 820512.43 534.26 1677721.60 00:08:36.640 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:08:36.640 Verification LBA range: start 0x4e length 0x4e 00:08:36.640 AIO0 : 6.14 87.59 5.47 0.00 0.00 726382.13 612.62 1728782.69 00:08:36.640 =================================================================================================================== 00:08:36.640 Total : 2323.20 145.20 0.00 0.00 956883.50 477.27 3457565.38 00:08:36.640 00:08:36.640 real 0m7.299s 00:08:36.640 user 0m13.731s 00:08:36.640 sys 0m0.385s 00:08:36.640 13:31:24 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.640 13:31:24 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:08:36.640 ************************************ 00:08:36.640 END TEST bdev_verify_big_io 00:08:36.640 ************************************ 00:08:36.640 13:31:24 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:36.640 13:31:24 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:36.640 13:31:24 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:36.640 13:31:24 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.640 13:31:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:36.640 ************************************ 00:08:36.640 START TEST bdev_write_zeroes 00:08:36.640 ************************************ 00:08:36.640 13:31:24 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:36.640 [2024-07-15 13:31:24.239218] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:08:36.640 [2024-07-15 13:31:24.239270] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4156952 ] 00:08:36.900 [2024-07-15 13:31:24.325179] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.900 [2024-07-15 13:31:24.415436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.185 [2024-07-15 13:31:24.571564] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:37.185 [2024-07-15 13:31:24.571618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:37.185 [2024-07-15 13:31:24.571627] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:37.185 [2024-07-15 13:31:24.579571] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:37.185 [2024-07-15 13:31:24.579588] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:37.185 [2024-07-15 13:31:24.587581] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:37.185 [2024-07-15 13:31:24.587596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:37.185 [2024-07-15 13:31:24.660394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:37.185 [2024-07-15 13:31:24.660439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:37.185 [2024-07-15 13:31:24.660453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1edf100 00:08:37.185 [2024-07-15 13:31:24.660466] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:37.185 [2024-07-15 13:31:24.661447] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:37.185 [2024-07-15 13:31:24.661467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:37.444 Running I/O for 1 seconds... 
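The vbdev_passthru notices above ("Match on Malloc3" ... "created pt_bdev for: TestPT") are emitted while bdevperf replays its --json configuration, which stacks a passthru virtual bdev named TestPT on top of the Malloc3 base bdev. A minimal sketch of such a configuration follows; the method and parameter names are the standard SPDK JSON-RPC ones, while the malloc geometry (262144 blocks of 512 bytes, i.e. 128 MiB) is an assumption borrowed from the malloc bdevs dumped later in this log rather than read from the actual bdev.json:
# Hypothetical minimal bdevperf --json config reproducing the Malloc3 -> TestPT passthru stack seen above
cat > bdev_sketch.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create", "params": { "name": "Malloc3", "num_blocks": 262144, "block_size": 512 } },
        { "method": "bdev_passthru_create", "params": { "base_bdev_name": "Malloc3", "name": "TestPT" } }
      ]
    }
  ]
}
EOF
The bdev_json_nonenclosed and bdev_json_nonarray tests further down deliberately violate the two structural rules visible here: the file must be a single {} object and "subsystems" must be an array.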
00:08:38.379 00:08:38.379 Latency(us) 00:08:38.379 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:38.379 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc0 : 1.02 7544.43 29.47 0.00 0.00 16958.05 455.90 28151.99 00:08:38.379 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc1p0 : 1.02 7537.41 29.44 0.00 0.00 16949.78 609.06 27582.11 00:08:38.379 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc1p1 : 1.04 7543.70 29.47 0.00 0.00 16914.95 609.06 27012.23 00:08:38.379 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p0 : 1.04 7536.90 29.44 0.00 0.00 16905.67 609.06 26442.35 00:08:38.379 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p1 : 1.04 7530.20 29.41 0.00 0.00 16895.13 605.50 25872.47 00:08:38.379 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p2 : 1.04 7523.48 29.39 0.00 0.00 16889.01 605.50 25188.62 00:08:38.379 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p3 : 1.04 7516.76 29.36 0.00 0.00 16879.20 601.93 24618.74 00:08:38.379 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p4 : 1.04 7510.10 29.34 0.00 0.00 16871.21 609.06 24048.86 00:08:38.379 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p5 : 1.04 7503.45 29.31 0.00 0.00 16853.78 633.99 23365.01 00:08:38.379 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p6 : 1.04 7496.74 29.28 0.00 0.00 16843.11 598.37 22795.13 00:08:38.379 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 Malloc2p7 : 1.04 7490.10 29.26 0.00 0.00 16840.01 605.50 22225.25 00:08:38.379 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 TestPT : 1.04 7483.49 29.23 0.00 0.00 16827.43 626.87 21655.37 00:08:38.379 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 raid0 : 1.04 7475.82 29.20 0.00 0.00 16812.01 1047.15 20629.59 00:08:38.379 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 concat0 : 1.05 7468.25 29.17 0.00 0.00 16788.33 1061.40 19489.84 00:08:38.379 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 raid1 : 1.05 7458.87 29.14 0.00 0.00 16762.51 1702.51 17780.20 00:08:38.379 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:38.379 AIO0 : 1.05 7453.14 29.11 0.00 0.00 16717.27 705.22 17666.23 00:08:38.379 =================================================================================================================== 00:08:38.379 Total : 120072.86 469.03 0.00 0.00 16856.52 455.90 28151.99 00:08:38.638 00:08:38.638 real 0m2.050s 00:08:38.638 user 0m1.685s 00:08:38.638 sys 0m0.307s 00:08:38.638 13:31:26 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:38.638 13:31:26 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:08:38.638 ************************************ 00:08:38.638 END TEST bdev_write_zeroes 00:08:38.638 ************************************ 00:08:38.897 13:31:26 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:08:38.897 13:31:26 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:38.897 13:31:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:38.897 13:31:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:38.897 13:31:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.897 ************************************ 00:08:38.897 START TEST bdev_json_nonenclosed 00:08:38.897 ************************************ 00:08:38.897 13:31:26 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:38.897 [2024-07-15 13:31:26.377197] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:08:38.897 [2024-07-15 13:31:26.377248] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4157182 ] 00:08:38.897 [2024-07-15 13:31:26.465905] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.157 [2024-07-15 13:31:26.555240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.157 [2024-07-15 13:31:26.555310] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:08:39.157 [2024-07-15 13:31:26.555327] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:39.157 [2024-07-15 13:31:26.555336] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:39.157 00:08:39.157 real 0m0.326s 00:08:39.157 user 0m0.197s 00:08:39.157 sys 0m0.126s 00:08:39.157 13:31:26 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:08:39.157 13:31:26 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.157 13:31:26 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:08:39.157 ************************************ 00:08:39.157 END TEST bdev_json_nonenclosed 00:08:39.157 ************************************ 00:08:39.157 13:31:26 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:08:39.157 13:31:26 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:08:39.157 13:31:26 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:39.157 13:31:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:39.157 13:31:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.157 13:31:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:39.157 ************************************ 00:08:39.157 START TEST bdev_json_nonarray 00:08:39.157 ************************************ 00:08:39.157 13:31:26 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:39.417 [2024-07-15 13:31:26.791066] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:08:39.417 [2024-07-15 13:31:26.791121] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4157348 ] 00:08:39.417 [2024-07-15 13:31:26.876450] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.417 [2024-07-15 13:31:26.963970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.418 [2024-07-15 13:31:26.964046] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:08:39.418 [2024-07-15 13:31:26.964061] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:39.418 [2024-07-15 13:31:26.964071] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:39.677 00:08:39.677 real 0m0.319s 00:08:39.677 user 0m0.185s 00:08:39.677 sys 0m0.132s 00:08:39.677 13:31:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:08:39.677 13:31:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.677 13:31:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:08:39.677 ************************************ 00:08:39.677 END TEST bdev_json_nonarray 00:08:39.677 ************************************ 00:08:39.677 13:31:27 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:08:39.677 13:31:27 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:08:39.677 13:31:27 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:08:39.677 13:31:27 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:08:39.677 13:31:27 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:39.677 13:31:27 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.677 13:31:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:39.677 ************************************ 00:08:39.677 START TEST bdev_qos 00:08:39.677 ************************************ 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=4157369 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 4157369' 00:08:39.677 Process qos testing pid: 4157369 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 4157369 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 4157369 ']' 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:39.677 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:39.677 13:31:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:39.677 [2024-07-15 13:31:27.201737] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
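The bdev_qos test starting here builds its fixture with ordinary JSON-RPC calls against the bdevperf app (launched with -z and reached through rpc_cmd). Condensed from the rpc_cmd trace that follows, the setup amounts to the sketch below; the rpc.py path and default RPC socket are assumptions, the arguments are the ones actually traced:
# Condensed from the rpc_cmd calls traced below: create the two test bdevs, then apply the three limits under test
./scripts/rpc.py bdev_malloc_create -b Malloc_0 128 512                 # 128 MiB malloc bdev, 512-byte blocks
./scripts/rpc.py bdev_null_create Null_1 128 512                        # 128 MiB null bdev
./scripts/rpc.py bdev_set_qos_limit --rw_ios_per_sec 23000 Malloc_0     # read/write IOPS cap, value derived from the unthrottled run (bdev_qos_iops)
./scripts/rpc.py bdev_set_qos_limit --rw_mbytes_per_sec 13 Null_1       # read/write bandwidth cap (bdev_qos_bw)
./scripts/rpc.py bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0       # read-only bandwidth cap (bdev_qos_ro_bw)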
00:08:39.677 [2024-07-15 13:31:27.201792] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4157369 ] 00:08:39.677 [2024-07-15 13:31:27.290380] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.936 [2024-07-15 13:31:27.381163] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:40.506 Malloc_0 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:40.506 [ 00:08:40.506 { 00:08:40.506 "name": "Malloc_0", 00:08:40.506 "aliases": [ 00:08:40.506 "cedcc299-a515-4691-884f-6702986415a4" 00:08:40.506 ], 00:08:40.506 "product_name": "Malloc disk", 00:08:40.506 "block_size": 512, 00:08:40.506 "num_blocks": 262144, 00:08:40.506 "uuid": "cedcc299-a515-4691-884f-6702986415a4", 00:08:40.506 "assigned_rate_limits": { 00:08:40.506 "rw_ios_per_sec": 0, 00:08:40.506 "rw_mbytes_per_sec": 0, 00:08:40.506 "r_mbytes_per_sec": 0, 00:08:40.506 "w_mbytes_per_sec": 0 00:08:40.506 }, 00:08:40.506 "claimed": false, 00:08:40.506 "zoned": false, 00:08:40.506 "supported_io_types": { 00:08:40.506 "read": true, 00:08:40.506 "write": true, 00:08:40.506 "unmap": true, 00:08:40.506 "flush": true, 00:08:40.506 "reset": true, 00:08:40.506 "nvme_admin": false, 00:08:40.506 "nvme_io": false, 00:08:40.506 "nvme_io_md": false, 00:08:40.506 "write_zeroes": true, 00:08:40.506 "zcopy": true, 00:08:40.506 "get_zone_info": false, 00:08:40.506 "zone_management": false, 00:08:40.506 "zone_append": false, 00:08:40.506 "compare": false, 
00:08:40.506 "compare_and_write": false, 00:08:40.506 "abort": true, 00:08:40.506 "seek_hole": false, 00:08:40.506 "seek_data": false, 00:08:40.506 "copy": true, 00:08:40.506 "nvme_iov_md": false 00:08:40.506 }, 00:08:40.506 "memory_domains": [ 00:08:40.506 { 00:08:40.506 "dma_device_id": "system", 00:08:40.506 "dma_device_type": 1 00:08:40.506 }, 00:08:40.506 { 00:08:40.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:40.506 "dma_device_type": 2 00:08:40.506 } 00:08:40.506 ], 00:08:40.506 "driver_specific": {} 00:08:40.506 } 00:08:40.506 ] 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:40.506 Null_1 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:08:40.506 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:40.507 [ 00:08:40.507 { 00:08:40.507 "name": "Null_1", 00:08:40.507 "aliases": [ 00:08:40.507 "e4acd743-cb43-419c-a3fa-c7864b1585fa" 00:08:40.507 ], 00:08:40.507 "product_name": "Null disk", 00:08:40.507 "block_size": 512, 00:08:40.507 "num_blocks": 262144, 00:08:40.507 "uuid": "e4acd743-cb43-419c-a3fa-c7864b1585fa", 00:08:40.507 "assigned_rate_limits": { 00:08:40.507 "rw_ios_per_sec": 0, 00:08:40.507 "rw_mbytes_per_sec": 0, 00:08:40.507 "r_mbytes_per_sec": 0, 00:08:40.507 "w_mbytes_per_sec": 0 00:08:40.507 }, 00:08:40.507 "claimed": false, 00:08:40.507 "zoned": false, 00:08:40.507 "supported_io_types": { 00:08:40.507 "read": true, 00:08:40.507 "write": true, 00:08:40.507 "unmap": false, 00:08:40.507 "flush": false, 00:08:40.507 "reset": true, 00:08:40.507 "nvme_admin": false, 00:08:40.507 "nvme_io": false, 00:08:40.507 "nvme_io_md": false, 00:08:40.507 "write_zeroes": true, 00:08:40.507 "zcopy": false, 00:08:40.507 "get_zone_info": false, 00:08:40.507 "zone_management": false, 00:08:40.507 "zone_append": false, 00:08:40.507 
"compare": false, 00:08:40.507 "compare_and_write": false, 00:08:40.507 "abort": true, 00:08:40.507 "seek_hole": false, 00:08:40.507 "seek_data": false, 00:08:40.507 "copy": false, 00:08:40.507 "nvme_iov_md": false 00:08:40.507 }, 00:08:40.507 "driver_specific": {} 00:08:40.507 } 00:08:40.507 ] 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:40.507 13:31:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:40.766 Running I/O for 60 seconds... 
00:08:46.039 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 94073.94 376295.75 0.00 0.00 378880.00 0.00 0.00 ' 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=94073.94 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 94073 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=94073 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=23000 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 23000 -gt 1000 ']' 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 23000 Malloc_0 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 23000 IOPS Malloc_0 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:46.040 13:31:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:46.040 ************************************ 00:08:46.040 START TEST bdev_qos_iops 00:08:46.040 ************************************ 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 23000 IOPS Malloc_0 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=23000 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:46.040 13:31:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 23005.88 92023.54 0.00 0.00 92736.00 0.00 0.00 ' 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=23005.88 00:08:51.301 13:31:38 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 23005 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=23005 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=20700 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=25300 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 23005 -lt 20700 ']' 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 23005 -gt 25300 ']' 00:08:51.301 00:08:51.301 real 0m5.175s 00:08:51.301 user 0m0.088s 00:08:51.301 sys 0m0.041s 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:51.301 13:31:38 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:08:51.301 ************************************ 00:08:51.301 END TEST bdev_qos_iops 00:08:51.301 ************************************ 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:51.301 13:31:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 33490.03 133960.14 0.00 0.00 135168.00 0.00 0.00 ' 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=135168.00 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 135168 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=135168 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=13 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 13 -lt 2 ']' 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 13 Null_1 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 13 BANDWIDTH Null_1 00:08:56.604 13:31:43 
blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.604 13:31:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:56.604 ************************************ 00:08:56.604 START TEST bdev_qos_bw 00:08:56.604 ************************************ 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 13 BANDWIDTH Null_1 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=13 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:56.604 13:31:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 3326.66 13306.62 0.00 0.00 13524.00 0.00 0.00 ' 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=13524.00 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 13524 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=13524 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=13312 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=11980 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=14643 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 13524 -lt 11980 ']' 00:09:01.881 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 13524 -gt 14643 ']' 00:09:01.882 00:09:01.882 real 0m5.176s 00:09:01.882 user 0m0.078s 00:09:01.882 sys 0m0.032s 00:09:01.882 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:01.882 13:31:48 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:01.882 ************************************ 00:09:01.882 END TEST bdev_qos_bw 00:09:01.882 ************************************ 00:09:01.882 13:31:48 blockdev_general.bdev_qos 
-- common/autotest_common.sh@1142 -- # return 0 00:09:01.882 13:31:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:01.882 13:31:48 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.882 13:31:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:01.882 13:31:49 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.882 13:31:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:01.882 13:31:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:01.882 13:31:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.882 13:31:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:01.882 ************************************ 00:09:01.882 START TEST bdev_qos_ro_bw 00:09:01.882 ************************************ 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:01.882 13:31:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 512.29 2049.15 0.00 0.00 2060.00 0.00 0.00 ' 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:09:07.150 13:31:54 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:09:07.150 00:09:07.150 real 0m5.154s 00:09:07.150 user 0m0.083s 00:09:07.150 sys 0m0.045s 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.150 13:31:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:09:07.150 ************************************ 00:09:07.150 END TEST bdev_qos_ro_bw 00:09:07.150 ************************************ 00:09:07.150 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:07.150 13:31:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:09:07.150 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:07.150 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:07.408 00:09:07.408 Latency(us) 00:09:07.408 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:07.408 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:07.408 Malloc_0 : 26.51 32173.43 125.68 0.00 0.00 7879.64 1317.84 503316.48 00:09:07.408 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:07.408 Null_1 : 26.61 32646.82 127.53 0.00 0.00 7828.60 569.88 95739.55 00:09:07.408 =================================================================================================================== 00:09:07.408 Total : 64820.24 253.20 0.00 0.00 7853.89 569.88 503316.48 00:09:07.408 0 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 4157369 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 4157369 ']' 00:09:07.408 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 4157369 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4157369 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4157369' 00:09:07.409 killing process with pid 4157369 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 4157369 00:09:07.409 Received shutdown signal, test time was about 26.668355 seconds 00:09:07.409 00:09:07.409 Latency(us) 00:09:07.409 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:09:07.409 =================================================================================================================== 00:09:07.409 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:07.409 13:31:54 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 4157369 00:09:07.667 13:31:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:09:07.667 00:09:07.667 real 0m27.965s 00:09:07.667 user 0m28.500s 00:09:07.667 sys 0m0.744s 00:09:07.667 13:31:55 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.667 13:31:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:07.667 ************************************ 00:09:07.667 END TEST bdev_qos 00:09:07.667 ************************************ 00:09:07.667 13:31:55 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:07.667 13:31:55 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:09:07.667 13:31:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:07.667 13:31:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.667 13:31:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:07.667 ************************************ 00:09:07.667 START TEST bdev_qd_sampling 00:09:07.667 ************************************ 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=4161164 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 4161164' 00:09:07.667 Process bdev QD sampling period testing pid: 4161164 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 4161164 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 4161164 ']' 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:07.667 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:07.667 13:31:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:07.667 [2024-07-15 13:31:55.226619] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
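Before the bdev_qd_sampling suite starting up here gets going, note that the bdev_qos_ro_bw pass above reduces to a +/-10% tolerance window around the configured read-only limit: bdev_set_qos_limit --r_mbytes_per_sec 2 works out to 2048 KiB/s, the window is [1843, 2252], and the 2060 KiB/s measured by iostat.py falls inside it. A minimal bash sketch of that check, using illustrative variable names rather than the exact locals in blockdev.sh:

    # Sketch only, not the verbatim blockdev.sh code path.
    qos_limit_kib=$((2 * 1024))                 # --r_mbytes_per_sec 2, expressed in KiB/s
    lower_limit=$((qos_limit_kib * 9 / 10))     # 1843
    upper_limit=$((qos_limit_kib * 11 / 10))    # 2252
    iostat_result=2060                          # KiB/s reported for Malloc_0 in the trace
    if [ "$iostat_result" -lt "$lower_limit" ] || [ "$iostat_result" -gt "$upper_limit" ]; then
        echo "QoS bandwidth $iostat_result KiB/s outside [$lower_limit, $upper_limit]" >&2
        exit 1
    fi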
00:09:07.667 [2024-07-15 13:31:55.226664] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4161164 ] 00:09:07.924 [2024-07-15 13:31:55.314992] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:07.924 [2024-07-15 13:31:55.408519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:07.924 [2024-07-15 13:31:55.408522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.489 Malloc_QD 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:08.489 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.489 [ 00:09:08.489 { 00:09:08.489 "name": "Malloc_QD", 00:09:08.489 "aliases": [ 00:09:08.489 "28082a18-9e48-49a8-adfa-2b96b1da6a2a" 00:09:08.489 ], 00:09:08.489 "product_name": "Malloc disk", 00:09:08.489 "block_size": 512, 00:09:08.489 "num_blocks": 262144, 00:09:08.489 "uuid": "28082a18-9e48-49a8-adfa-2b96b1da6a2a", 00:09:08.489 "assigned_rate_limits": { 00:09:08.489 "rw_ios_per_sec": 0, 00:09:08.489 "rw_mbytes_per_sec": 0, 00:09:08.489 "r_mbytes_per_sec": 0, 00:09:08.489 "w_mbytes_per_sec": 0 00:09:08.489 }, 00:09:08.489 "claimed": false, 00:09:08.489 "zoned": false, 00:09:08.489 "supported_io_types": { 00:09:08.489 "read": true, 00:09:08.489 "write": true, 00:09:08.489 "unmap": true, 00:09:08.489 "flush": true, 00:09:08.489 "reset": true, 00:09:08.489 "nvme_admin": false, 00:09:08.490 
"nvme_io": false, 00:09:08.490 "nvme_io_md": false, 00:09:08.490 "write_zeroes": true, 00:09:08.490 "zcopy": true, 00:09:08.490 "get_zone_info": false, 00:09:08.490 "zone_management": false, 00:09:08.490 "zone_append": false, 00:09:08.490 "compare": false, 00:09:08.490 "compare_and_write": false, 00:09:08.490 "abort": true, 00:09:08.490 "seek_hole": false, 00:09:08.490 "seek_data": false, 00:09:08.490 "copy": true, 00:09:08.490 "nvme_iov_md": false 00:09:08.490 }, 00:09:08.490 "memory_domains": [ 00:09:08.490 { 00:09:08.490 "dma_device_id": "system", 00:09:08.490 "dma_device_type": 1 00:09:08.490 }, 00:09:08.490 { 00:09:08.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:08.490 "dma_device_type": 2 00:09:08.490 } 00:09:08.490 ], 00:09:08.490 "driver_specific": {} 00:09:08.490 } 00:09:08.490 ] 00:09:08.490 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:08.490 13:31:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:09:08.490 13:31:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:09:08.490 13:31:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:08.747 Running I/O for 5 seconds... 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:09:10.646 "tick_rate": 2300000000, 00:09:10.646 "ticks": 11142497493654976, 00:09:10.646 "bdevs": [ 00:09:10.646 { 00:09:10.646 "name": "Malloc_QD", 00:09:10.646 "bytes_read": 1014018560, 00:09:10.646 "num_read_ops": 247556, 00:09:10.646 "bytes_written": 0, 00:09:10.646 "num_write_ops": 0, 00:09:10.646 "bytes_unmapped": 0, 00:09:10.646 "num_unmap_ops": 0, 00:09:10.646 "bytes_copied": 0, 00:09:10.646 "num_copy_ops": 0, 00:09:10.646 "read_latency_ticks": 2271251218192, 00:09:10.646 "max_read_latency_ticks": 11336006, 00:09:10.646 "min_read_latency_ticks": 208448, 00:09:10.646 "write_latency_ticks": 0, 00:09:10.646 "max_write_latency_ticks": 0, 00:09:10.646 "min_write_latency_ticks": 0, 00:09:10.646 "unmap_latency_ticks": 0, 00:09:10.646 "max_unmap_latency_ticks": 0, 00:09:10.646 
"min_unmap_latency_ticks": 0, 00:09:10.646 "copy_latency_ticks": 0, 00:09:10.646 "max_copy_latency_ticks": 0, 00:09:10.646 "min_copy_latency_ticks": 0, 00:09:10.646 "io_error": {}, 00:09:10.646 "queue_depth_polling_period": 10, 00:09:10.646 "queue_depth": 512, 00:09:10.646 "io_time": 30, 00:09:10.646 "weighted_io_time": 15360 00:09:10.646 } 00:09:10.646 ] 00:09:10.646 }' 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:10.646 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.646 00:09:10.646 Latency(us) 00:09:10.646 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:10.646 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:10.646 Malloc_QD : 2.00 62817.08 245.38 0.00 0.00 4066.36 1089.89 4473.54 00:09:10.646 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:10.647 Malloc_QD : 2.00 65363.21 255.33 0.00 0.00 3908.47 740.84 4929.45 00:09:10.647 =================================================================================================================== 00:09:10.647 Total : 128180.29 500.70 0.00 0.00 3985.84 740.84 4929.45 00:09:10.647 0 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 4161164 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 4161164 ']' 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 4161164 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4161164 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4161164' 00:09:10.647 killing process with pid 4161164 00:09:10.647 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 4161164 00:09:10.647 Received shutdown signal, test time was about 2.072249 seconds 00:09:10.647 00:09:10.647 Latency(us) 00:09:10.647 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:10.647 =================================================================================================================== 00:09:10.647 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:10.647 13:31:58 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 4161164 00:09:10.905 13:31:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:09:10.905 00:09:10.905 real 0m3.274s 00:09:10.905 user 0m6.391s 00:09:10.905 sys 0m0.372s 00:09:10.905 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:10.905 13:31:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.905 ************************************ 00:09:10.905 END TEST bdev_qd_sampling 00:09:10.905 ************************************ 00:09:10.905 13:31:58 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:10.905 13:31:58 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:09:10.905 13:31:58 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:10.905 13:31:58 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.905 13:31:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:11.163 ************************************ 00:09:11.163 START TEST bdev_error 00:09:11.163 ************************************ 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=4161572 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 4161572' 00:09:11.163 Process error testing pid: 4161572 00:09:11.163 13:31:58 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 4161572 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 4161572 ']' 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:11.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:11.163 13:31:58 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.163 [2024-07-15 13:31:58.580680] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:11.163 [2024-07-15 13:31:58.580731] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4161572 ] 00:09:11.163 [2024-07-15 13:31:58.666460] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.163 [2024-07-15 13:31:58.749100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:12.107 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.107 Dev_1 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.107 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.107 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.107 [ 00:09:12.107 { 00:09:12.107 "name": "Dev_1", 00:09:12.107 "aliases": [ 00:09:12.107 "64b8f9ff-19c8-43f0-96f1-ff3baca5ba79" 00:09:12.107 ], 00:09:12.107 "product_name": "Malloc disk", 00:09:12.107 "block_size": 512, 00:09:12.107 "num_blocks": 262144, 00:09:12.107 "uuid": "64b8f9ff-19c8-43f0-96f1-ff3baca5ba79", 00:09:12.107 "assigned_rate_limits": { 00:09:12.107 "rw_ios_per_sec": 0, 00:09:12.107 "rw_mbytes_per_sec": 0, 00:09:12.107 "r_mbytes_per_sec": 0, 00:09:12.107 "w_mbytes_per_sec": 0 00:09:12.107 }, 00:09:12.107 "claimed": false, 00:09:12.107 "zoned": false, 00:09:12.107 "supported_io_types": { 00:09:12.107 "read": true, 00:09:12.107 "write": true, 00:09:12.108 "unmap": true, 00:09:12.108 "flush": true, 00:09:12.108 "reset": true, 00:09:12.108 "nvme_admin": false, 00:09:12.108 "nvme_io": false, 00:09:12.108 "nvme_io_md": false, 00:09:12.108 "write_zeroes": true, 00:09:12.108 "zcopy": true, 00:09:12.108 "get_zone_info": false, 00:09:12.108 "zone_management": false, 00:09:12.108 "zone_append": false, 00:09:12.108 
"compare": false, 00:09:12.108 "compare_and_write": false, 00:09:12.108 "abort": true, 00:09:12.108 "seek_hole": false, 00:09:12.108 "seek_data": false, 00:09:12.108 "copy": true, 00:09:12.108 "nvme_iov_md": false 00:09:12.108 }, 00:09:12.108 "memory_domains": [ 00:09:12.108 { 00:09:12.108 "dma_device_id": "system", 00:09:12.108 "dma_device_type": 1 00:09:12.108 }, 00:09:12.108 { 00:09:12.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:12.108 "dma_device_type": 2 00:09:12.108 } 00:09:12.108 ], 00:09:12.108 "driver_specific": {} 00:09:12.108 } 00:09:12.108 ] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:12.108 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.108 true 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.108 Dev_2 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.108 [ 00:09:12.108 { 00:09:12.108 "name": "Dev_2", 00:09:12.108 "aliases": [ 00:09:12.108 "795bc1cc-2691-4d5c-a597-50e59f12a27e" 00:09:12.108 ], 00:09:12.108 "product_name": "Malloc disk", 00:09:12.108 "block_size": 512, 00:09:12.108 "num_blocks": 262144, 00:09:12.108 "uuid": "795bc1cc-2691-4d5c-a597-50e59f12a27e", 00:09:12.108 "assigned_rate_limits": { 00:09:12.108 "rw_ios_per_sec": 0, 00:09:12.108 "rw_mbytes_per_sec": 0, 00:09:12.108 "r_mbytes_per_sec": 0, 00:09:12.108 "w_mbytes_per_sec": 0 00:09:12.108 }, 00:09:12.108 "claimed": false, 
00:09:12.108 "zoned": false, 00:09:12.108 "supported_io_types": { 00:09:12.108 "read": true, 00:09:12.108 "write": true, 00:09:12.108 "unmap": true, 00:09:12.108 "flush": true, 00:09:12.108 "reset": true, 00:09:12.108 "nvme_admin": false, 00:09:12.108 "nvme_io": false, 00:09:12.108 "nvme_io_md": false, 00:09:12.108 "write_zeroes": true, 00:09:12.108 "zcopy": true, 00:09:12.108 "get_zone_info": false, 00:09:12.108 "zone_management": false, 00:09:12.108 "zone_append": false, 00:09:12.108 "compare": false, 00:09:12.108 "compare_and_write": false, 00:09:12.108 "abort": true, 00:09:12.108 "seek_hole": false, 00:09:12.108 "seek_data": false, 00:09:12.108 "copy": true, 00:09:12.108 "nvme_iov_md": false 00:09:12.108 }, 00:09:12.108 "memory_domains": [ 00:09:12.108 { 00:09:12.108 "dma_device_id": "system", 00:09:12.108 "dma_device_type": 1 00:09:12.108 }, 00:09:12.108 { 00:09:12.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:12.108 "dma_device_type": 2 00:09:12.108 } 00:09:12.108 ], 00:09:12.108 "driver_specific": {} 00:09:12.108 } 00:09:12.108 ] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:12.108 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.108 13:31:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:12.108 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:09:12.108 13:31:59 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:12.108 Running I/O for 5 seconds... 00:09:13.042 13:32:00 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 4161572 00:09:13.042 13:32:00 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 4161572' 00:09:13.042 Process is existed as continue on error is set. 
Pid: 4161572 00:09:13.042 13:32:00 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:09:13.042 13:32:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:13.042 13:32:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:13.042 13:32:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:13.042 13:32:00 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:09:13.042 13:32:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:13.042 13:32:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:13.042 13:32:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:13.042 13:32:00 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:09:13.042 Timeout while waiting for response: 00:09:13.042 00:09:13.042 00:09:17.228 00:09:17.228 Latency(us) 00:09:17.228 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:17.228 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:17.228 EE_Dev_1 : 0.93 57962.13 226.41 5.39 0.00 273.84 97.06 562.75 00:09:17.228 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:17.228 Dev_2 : 5.00 126157.84 492.80 0.00 0.00 124.62 41.41 21769.35 00:09:17.228 =================================================================================================================== 00:09:17.228 Total : 184119.97 719.22 5.39 0.00 136.34 41.41 21769.35 00:09:18.163 13:32:05 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 4161572 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 4161572 ']' 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 4161572 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4161572 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4161572' 00:09:18.163 killing process with pid 4161572 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 4161572 00:09:18.163 Received shutdown signal, test time was about 5.000000 seconds 00:09:18.163 00:09:18.163 Latency(us) 00:09:18.163 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:18.163 =================================================================================================================== 00:09:18.163 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:18.163 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 4161572 00:09:18.421 13:32:05 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=4162604 00:09:18.421 13:32:05 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 4162604' 00:09:18.421 13:32:05 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:09:18.421 Process error testing pid: 4162604 00:09:18.421 13:32:05 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 4162604 00:09:18.421 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 4162604 ']' 00:09:18.421 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:18.421 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:18.421 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:18.421 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:18.421 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:18.421 13:32:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:18.421 [2024-07-15 13:32:05.962436] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:09:18.421 [2024-07-15 13:32:05.962488] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4162604 ] 00:09:18.679 [2024-07-15 13:32:06.050033] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:18.679 [2024-07-15 13:32:06.140508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:19.248 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.248 Dev_1 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.248 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:19.248 13:32:06 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.248 [ 00:09:19.248 { 00:09:19.248 "name": "Dev_1", 00:09:19.248 "aliases": [ 00:09:19.248 "539543d3-ca2a-4abc-bb39-c774880c8213" 00:09:19.248 ], 00:09:19.248 "product_name": "Malloc disk", 00:09:19.248 "block_size": 512, 00:09:19.248 "num_blocks": 262144, 00:09:19.248 "uuid": "539543d3-ca2a-4abc-bb39-c774880c8213", 00:09:19.248 "assigned_rate_limits": { 00:09:19.248 "rw_ios_per_sec": 0, 00:09:19.248 "rw_mbytes_per_sec": 0, 00:09:19.248 "r_mbytes_per_sec": 0, 00:09:19.248 "w_mbytes_per_sec": 0 00:09:19.248 }, 00:09:19.248 "claimed": false, 00:09:19.248 "zoned": false, 00:09:19.248 "supported_io_types": { 00:09:19.248 "read": true, 00:09:19.248 "write": true, 00:09:19.248 "unmap": true, 00:09:19.248 "flush": true, 00:09:19.248 "reset": true, 00:09:19.248 "nvme_admin": false, 00:09:19.248 "nvme_io": false, 00:09:19.248 "nvme_io_md": false, 00:09:19.248 "write_zeroes": true, 00:09:19.248 "zcopy": true, 00:09:19.248 "get_zone_info": false, 00:09:19.248 "zone_management": false, 00:09:19.248 "zone_append": false, 00:09:19.248 "compare": false, 00:09:19.248 "compare_and_write": false, 00:09:19.248 "abort": true, 00:09:19.248 "seek_hole": false, 00:09:19.248 "seek_data": false, 00:09:19.248 "copy": true, 00:09:19.248 "nvme_iov_md": false 00:09:19.248 }, 00:09:19.248 "memory_domains": [ 00:09:19.248 { 00:09:19.248 "dma_device_id": "system", 00:09:19.248 "dma_device_type": 1 00:09:19.248 }, 00:09:19.248 { 00:09:19.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.248 "dma_device_type": 2 00:09:19.248 } 00:09:19.248 ], 00:09:19.248 "driver_specific": {} 00:09:19.248 } 00:09:19.248 ] 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:19.248 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.248 true 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.248 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.248 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.505 Dev_2 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.505 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:09:19.505 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.506 [ 00:09:19.506 { 00:09:19.506 "name": "Dev_2", 00:09:19.506 "aliases": [ 00:09:19.506 "c9abd977-1aa5-4aea-aff8-28545c5c6dd5" 00:09:19.506 ], 00:09:19.506 "product_name": "Malloc disk", 00:09:19.506 "block_size": 512, 00:09:19.506 "num_blocks": 262144, 00:09:19.506 "uuid": "c9abd977-1aa5-4aea-aff8-28545c5c6dd5", 00:09:19.506 "assigned_rate_limits": { 00:09:19.506 "rw_ios_per_sec": 0, 00:09:19.506 "rw_mbytes_per_sec": 0, 00:09:19.506 "r_mbytes_per_sec": 0, 00:09:19.506 "w_mbytes_per_sec": 0 00:09:19.506 }, 00:09:19.506 "claimed": false, 00:09:19.506 "zoned": false, 00:09:19.506 "supported_io_types": { 00:09:19.506 "read": true, 00:09:19.506 "write": true, 00:09:19.506 "unmap": true, 00:09:19.506 "flush": true, 00:09:19.506 "reset": true, 00:09:19.506 "nvme_admin": false, 00:09:19.506 "nvme_io": false, 00:09:19.506 "nvme_io_md": false, 00:09:19.506 "write_zeroes": true, 00:09:19.506 "zcopy": true, 00:09:19.506 "get_zone_info": false, 00:09:19.506 "zone_management": false, 00:09:19.506 "zone_append": false, 00:09:19.506 "compare": false, 00:09:19.506 "compare_and_write": false, 00:09:19.506 "abort": true, 00:09:19.506 "seek_hole": false, 00:09:19.506 "seek_data": false, 00:09:19.506 "copy": true, 00:09:19.506 "nvme_iov_md": false 00:09:19.506 }, 00:09:19.506 "memory_domains": [ 00:09:19.506 { 00:09:19.506 "dma_device_id": "system", 00:09:19.506 "dma_device_type": 1 00:09:19.506 }, 00:09:19.506 { 00:09:19.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.506 "dma_device_type": 2 00:09:19.506 } 00:09:19.506 ], 00:09:19.506 "driver_specific": {} 00:09:19.506 } 00:09:19.506 ] 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:19.506 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:19.506 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 4162604 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 4162604 00:09:19.506 13:32:06 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:09:19.506 13:32:06 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:19.506 13:32:06 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 4162604 00:09:19.506 Running I/O for 5 seconds... 00:09:19.506 task offset: 166752 on job bdev=EE_Dev_1 fails 00:09:19.506 00:09:19.506 Latency(us) 00:09:19.506 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:19.506 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:19.506 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:09:19.506 EE_Dev_1 : 0.00 46315.79 180.92 10526.32 0.00 231.65 93.94 416.72 00:09:19.506 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:19.506 Dev_2 : 0.00 29038.11 113.43 0.00 0.00 404.62 81.92 751.53 00:09:19.506 =================================================================================================================== 00:09:19.506 Total : 75353.90 294.35 10526.32 0.00 325.47 81.92 751.53 00:09:19.506 [2024-07-15 13:32:07.007229] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:19.506 request: 00:09:19.506 { 00:09:19.506 "method": "perform_tests", 00:09:19.506 "req_id": 1 00:09:19.506 } 00:09:19.506 Got JSON-RPC error response 00:09:19.506 response: 00:09:19.506 { 00:09:19.506 "code": -32603, 00:09:19.506 "message": "bdevperf failed with error Operation not permitted" 00:09:19.506 } 00:09:19.763 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:09:19.763 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:19.763 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:09:19.822 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:09:19.822 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:09:19.822 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:19.822 00:09:19.822 real 0m8.731s 00:09:19.822 user 0m8.978s 00:09:19.822 sys 0m0.752s 00:09:19.822 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:19.822 13:32:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.822 ************************************ 00:09:19.822 END TEST bdev_error 00:09:19.822 ************************************ 00:09:19.822 13:32:07 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:19.822 13:32:07 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:09:19.822 13:32:07 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:19.822 13:32:07 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.822 13:32:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:19.822 ************************************ 00:09:19.822 START TEST bdev_stat 00:09:19.822 ************************************ 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=4162801 00:09:19.822 
13:32:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 4162801' 00:09:19.822 Process Bdev IO statistics testing pid: 4162801 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 4162801 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 4162801 ']' 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:19.822 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:19.822 13:32:07 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.079 [2024-07-15 13:32:07.405730] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:09:20.080 [2024-07-15 13:32:07.405797] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4162801 ] 00:09:20.080 [2024-07-15 13:32:07.495174] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:20.080 [2024-07-15 13:32:07.579612] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:20.080 [2024-07-15 13:32:07.579615] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.647 Malloc_STAT 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
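The Malloc_STAT setup traced here follows the same create-then-wait pattern used earlier for Malloc_QD, Dev_1 and Dev_2: create a 128 MiB malloc bdev with 512-byte blocks, wait for bdev examine to complete, then fetch the bdev descriptor to confirm it is registered. Run standalone against an already started SPDK application, a roughly equivalent sequence with the stock rpc.py CLI (default /var/tmp/spdk.sock socket assumed) would be:

    # Approximate standalone equivalent of the rpc_cmd calls in the trace.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    $SPDK/scripts/rpc.py -s /var/tmp/spdk.sock bdev_malloc_create -b Malloc_STAT 128 512
    $SPDK/scripts/rpc.py -s /var/tmp/spdk.sock bdev_wait_for_examine
    $SPDK/scripts/rpc.py -s /var/tmp/spdk.sock bdev_get_bdevs -b Malloc_STAT -t 2000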
00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.647 [ 00:09:20.647 { 00:09:20.647 "name": "Malloc_STAT", 00:09:20.647 "aliases": [ 00:09:20.647 "6b557f28-e896-4e80-b3ce-12cae76844b1" 00:09:20.647 ], 00:09:20.647 "product_name": "Malloc disk", 00:09:20.647 "block_size": 512, 00:09:20.647 "num_blocks": 262144, 00:09:20.647 "uuid": "6b557f28-e896-4e80-b3ce-12cae76844b1", 00:09:20.647 "assigned_rate_limits": { 00:09:20.647 "rw_ios_per_sec": 0, 00:09:20.647 "rw_mbytes_per_sec": 0, 00:09:20.647 "r_mbytes_per_sec": 0, 00:09:20.647 "w_mbytes_per_sec": 0 00:09:20.647 }, 00:09:20.647 "claimed": false, 00:09:20.647 "zoned": false, 00:09:20.647 "supported_io_types": { 00:09:20.647 "read": true, 00:09:20.647 "write": true, 00:09:20.647 "unmap": true, 00:09:20.647 "flush": true, 00:09:20.647 "reset": true, 00:09:20.647 "nvme_admin": false, 00:09:20.647 "nvme_io": false, 00:09:20.647 "nvme_io_md": false, 00:09:20.647 "write_zeroes": true, 00:09:20.647 "zcopy": true, 00:09:20.647 "get_zone_info": false, 00:09:20.647 "zone_management": false, 00:09:20.647 "zone_append": false, 00:09:20.647 "compare": false, 00:09:20.647 "compare_and_write": false, 00:09:20.647 "abort": true, 00:09:20.647 "seek_hole": false, 00:09:20.647 "seek_data": false, 00:09:20.647 "copy": true, 00:09:20.647 "nvme_iov_md": false 00:09:20.647 }, 00:09:20.647 "memory_domains": [ 00:09:20.647 { 00:09:20.647 "dma_device_id": "system", 00:09:20.647 "dma_device_type": 1 00:09:20.647 }, 00:09:20.647 { 00:09:20.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:20.647 "dma_device_type": 2 00:09:20.647 } 00:09:20.647 ], 00:09:20.647 "driver_specific": {} 00:09:20.647 } 00:09:20.647 ] 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.647 13:32:08 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:09:20.904 13:32:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:09:20.904 13:32:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:20.904 Running I/O for 10 seconds... 
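The iostat queries that follow implement a simple consistency check: take a whole-bdev num_read_ops snapshot, read the per-channel counters, take a second whole-bdev snapshot, and require the per-channel sum to fall between the two snapshots (the trace below ends up testing 254720 against the bounds 246788 and 268548). In outline, with rpc.py standing in for the rpc_cmd wrapper and jq expressions of my own choosing, the check looks roughly like:

    # Illustrative bound check; the real blockdev.sh reads each channel separately.
    io_count1=$(rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    per_channel_sum=$(rpc.py bdev_get_iostat -b Malloc_STAT -c \
        | jq -r '[.channels[].num_read_ops] | add')
    io_count2=$(rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # counters only grow, so the per-channel sum taken in between must sit
    # inside the [io_count1, io_count2] window
    [ "$per_channel_sum" -ge "$io_count1" ] && [ "$per_channel_sum" -le "$io_count2" ] || exit 1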
00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:09:22.804 "tick_rate": 2300000000, 00:09:22.804 "ticks": 11142525380437462, 00:09:22.804 "bdevs": [ 00:09:22.804 { 00:09:22.804 "name": "Malloc_STAT", 00:09:22.804 "bytes_read": 1010872832, 00:09:22.804 "num_read_ops": 246788, 00:09:22.804 "bytes_written": 0, 00:09:22.804 "num_write_ops": 0, 00:09:22.804 "bytes_unmapped": 0, 00:09:22.804 "num_unmap_ops": 0, 00:09:22.804 "bytes_copied": 0, 00:09:22.804 "num_copy_ops": 0, 00:09:22.804 "read_latency_ticks": 2256342522604, 00:09:22.804 "max_read_latency_ticks": 13132518, 00:09:22.804 "min_read_latency_ticks": 205420, 00:09:22.804 "write_latency_ticks": 0, 00:09:22.804 "max_write_latency_ticks": 0, 00:09:22.804 "min_write_latency_ticks": 0, 00:09:22.804 "unmap_latency_ticks": 0, 00:09:22.804 "max_unmap_latency_ticks": 0, 00:09:22.804 "min_unmap_latency_ticks": 0, 00:09:22.804 "copy_latency_ticks": 0, 00:09:22.804 "max_copy_latency_ticks": 0, 00:09:22.804 "min_copy_latency_ticks": 0, 00:09:22.804 "io_error": {} 00:09:22.804 } 00:09:22.804 ] 00:09:22.804 }' 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=246788 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:09:22.804 "tick_rate": 2300000000, 00:09:22.804 "ticks": 11142525520787080, 00:09:22.804 "name": "Malloc_STAT", 00:09:22.804 "channels": [ 00:09:22.804 { 00:09:22.804 "thread_id": 2, 00:09:22.804 "bytes_read": 516947968, 00:09:22.804 "num_read_ops": 126208, 00:09:22.804 "bytes_written": 0, 00:09:22.804 "num_write_ops": 0, 00:09:22.804 "bytes_unmapped": 0, 00:09:22.804 "num_unmap_ops": 
0, 00:09:22.804 "bytes_copied": 0, 00:09:22.804 "num_copy_ops": 0, 00:09:22.804 "read_latency_ticks": 1163798373926, 00:09:22.804 "max_read_latency_ticks": 12134390, 00:09:22.804 "min_read_latency_ticks": 5879172, 00:09:22.804 "write_latency_ticks": 0, 00:09:22.804 "max_write_latency_ticks": 0, 00:09:22.804 "min_write_latency_ticks": 0, 00:09:22.804 "unmap_latency_ticks": 0, 00:09:22.804 "max_unmap_latency_ticks": 0, 00:09:22.804 "min_unmap_latency_ticks": 0, 00:09:22.804 "copy_latency_ticks": 0, 00:09:22.804 "max_copy_latency_ticks": 0, 00:09:22.804 "min_copy_latency_ticks": 0 00:09:22.804 }, 00:09:22.804 { 00:09:22.804 "thread_id": 3, 00:09:22.804 "bytes_read": 526385152, 00:09:22.804 "num_read_ops": 128512, 00:09:22.804 "bytes_written": 0, 00:09:22.804 "num_write_ops": 0, 00:09:22.804 "bytes_unmapped": 0, 00:09:22.804 "num_unmap_ops": 0, 00:09:22.804 "bytes_copied": 0, 00:09:22.804 "num_copy_ops": 0, 00:09:22.804 "read_latency_ticks": 1166058236938, 00:09:22.804 "max_read_latency_ticks": 13132518, 00:09:22.804 "min_read_latency_ticks": 5891154, 00:09:22.804 "write_latency_ticks": 0, 00:09:22.804 "max_write_latency_ticks": 0, 00:09:22.804 "min_write_latency_ticks": 0, 00:09:22.804 "unmap_latency_ticks": 0, 00:09:22.804 "max_unmap_latency_ticks": 0, 00:09:22.804 "min_unmap_latency_ticks": 0, 00:09:22.804 "copy_latency_ticks": 0, 00:09:22.804 "max_copy_latency_ticks": 0, 00:09:22.804 "min_copy_latency_ticks": 0 00:09:22.804 } 00:09:22.804 ] 00:09:22.804 }' 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=126208 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=126208 00:09:22.804 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=128512 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=254720 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.061 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:09:23.061 "tick_rate": 2300000000, 00:09:23.061 "ticks": 11142525771617314, 00:09:23.062 "bdevs": [ 00:09:23.062 { 00:09:23.062 "name": "Malloc_STAT", 00:09:23.062 "bytes_read": 1100001792, 00:09:23.062 "num_read_ops": 268548, 00:09:23.062 "bytes_written": 0, 00:09:23.062 "num_write_ops": 0, 00:09:23.062 "bytes_unmapped": 0, 00:09:23.062 "num_unmap_ops": 0, 00:09:23.062 "bytes_copied": 0, 00:09:23.062 "num_copy_ops": 0, 00:09:23.062 "read_latency_ticks": 2458177950564, 00:09:23.062 "max_read_latency_ticks": 13132518, 00:09:23.062 "min_read_latency_ticks": 205420, 00:09:23.062 "write_latency_ticks": 0, 00:09:23.062 "max_write_latency_ticks": 0, 00:09:23.062 "min_write_latency_ticks": 0, 00:09:23.062 "unmap_latency_ticks": 0, 00:09:23.062 "max_unmap_latency_ticks": 0, 00:09:23.062 "min_unmap_latency_ticks": 0, 00:09:23.062 "copy_latency_ticks": 0, 00:09:23.062 "max_copy_latency_ticks": 0, 
00:09:23.062 "min_copy_latency_ticks": 0, 00:09:23.062 "io_error": {} 00:09:23.062 } 00:09:23.062 ] 00:09:23.062 }' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=268548 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 254720 -lt 246788 ']' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 254720 -gt 268548 ']' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:23.062 00:09:23.062 Latency(us) 00:09:23.062 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:23.062 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:23.062 Malloc_STAT : 2.15 63688.54 248.78 0.00 0.00 4011.15 1097.02 5299.87 00:09:23.062 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:23.062 Malloc_STAT : 2.16 64738.79 252.89 0.00 0.00 3946.53 662.48 5727.28 00:09:23.062 =================================================================================================================== 00:09:23.062 Total : 128427.33 501.67 0.00 0.00 3978.57 662.48 5727.28 00:09:23.062 0 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 4162801 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 4162801 ']' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 4162801 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4162801 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4162801' 00:09:23.062 killing process with pid 4162801 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 4162801 00:09:23.062 Received shutdown signal, test time was about 2.229826 seconds 00:09:23.062 00:09:23.062 Latency(us) 00:09:23.062 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:23.062 =================================================================================================================== 00:09:23.062 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:23.062 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 4162801 00:09:23.321 13:32:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:09:23.321 00:09:23.321 real 0m3.438s 00:09:23.321 user 0m6.831s 00:09:23.321 sys 0m0.395s 00:09:23.321 13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.321 
13:32:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:23.321 ************************************ 00:09:23.321 END TEST bdev_stat 00:09:23.321 ************************************ 00:09:23.321 13:32:10 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:09:23.321 13:32:10 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:09:23.321 00:09:23.321 real 1m47.091s 00:09:23.321 user 6m56.639s 00:09:23.321 sys 0m18.714s 00:09:23.321 13:32:10 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.321 13:32:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:23.321 ************************************ 00:09:23.321 END TEST blockdev_general 00:09:23.321 ************************************ 00:09:23.321 13:32:10 -- common/autotest_common.sh@1142 -- # return 0 00:09:23.321 13:32:10 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:23.321 13:32:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:23.321 13:32:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.321 13:32:10 -- common/autotest_common.sh@10 -- # set +x 00:09:23.321 ************************************ 00:09:23.321 START TEST bdev_raid 00:09:23.321 ************************************ 00:09:23.321 13:32:10 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:23.579 * Looking for test storage... 
00:09:23.579 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:23.579 13:32:11 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:09:23.579 13:32:11 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:09:23.579 13:32:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:23.579 13:32:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.579 13:32:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:23.579 ************************************ 00:09:23.579 START TEST raid_function_test_raid0 00:09:23.579 ************************************ 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=4163418 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 4163418' 00:09:23.579 Process raid pid: 4163418 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 4163418 /var/tmp/spdk-raid.sock 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 4163418 ']' 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:23.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
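Before the first raid function test starts tracing below, the harness brings up bdev_svc on a dedicated RPC socket and builds a two-member raid0. The rpcs.txt it feeds to rpc.py is not echoed in this log, so the sketch below is an approximation: the base-bdev type (malloc) and the 64 KiB strip size are assumptions, while the socket path, bdev names and nbd export match the trace.

#!/usr/bin/env bash
# Rough sketch of what configure_raid_bdev sets up for raid_function_test.
set -e

rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two 32 MiB, 512 B-block base bdevs (names taken from the "is claimed" lines).
$rpc bdev_malloc_create 32 512 -b Base_1
$rpc bdev_malloc_create 32 512 -b Base_2

# Assemble them into a raid0 bdev named "raid" (strip size assumed).
$rpc bdev_raid_create -n raid -z 64 -r raid0 -b 'Base_1 Base_2'

# Confirm the raid came online, then expose it through the kernel nbd driver.
$rpc bdev_raid_get_bdevs online | jq -r '.[0]["name"] | select(.)'
modprobe nbd
$rpc nbd_start_disk raid /dev/nbd0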
00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:23.579 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:23.579 [2024-07-15 13:32:11.164069] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:09:23.579 [2024-07-15 13:32:11.164125] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:23.837 [2024-07-15 13:32:11.249776] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.837 [2024-07-15 13:32:11.340839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.837 [2024-07-15 13:32:11.399288] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:23.837 [2024-07-15 13:32:11.399314] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:09:24.404 13:32:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:24.663 [2024-07-15 13:32:12.153935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:24.663 [2024-07-15 13:32:12.154762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:24.663 [2024-07-15 13:32:12.154809] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf040a0 00:09:24.663 [2024-07-15 13:32:12.154817] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:24.663 [2024-07-15 13:32:12.155028] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf079f0 00:09:24.663 [2024-07-15 13:32:12.155113] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf040a0 00:09:24.663 [2024-07-15 13:32:12.155119] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xf040a0 00:09:24.663 [2024-07-15 13:32:12.155206] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:24.663 Base_1 00:09:24.663 Base_2 00:09:24.663 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:24.663 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:24.663 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:24.922 13:32:12 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:24.922 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:24.922 [2024-07-15 13:32:12.522908] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf07800 00:09:24.922 /dev/nbd0 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:25.180 1+0 records in 00:09:25.180 1+0 records out 00:09:25.180 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283196 s, 14.5 MB/s 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:09:25.180 13:32:12 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:25.180 { 00:09:25.180 "nbd_device": "/dev/nbd0", 00:09:25.180 "bdev_name": "raid" 00:09:25.180 } 00:09:25.180 ]' 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:25.180 { 00:09:25.180 "nbd_device": "/dev/nbd0", 00:09:25.180 "bdev_name": "raid" 00:09:25.180 } 00:09:25.180 ]' 00:09:25.180 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:25.438 
13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:25.438 4096+0 records in 00:09:25.438 4096+0 records out 00:09:25.438 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0275677 s, 76.1 MB/s 00:09:25.438 13:32:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:25.438 4096+0 records in 00:09:25.438 4096+0 records out 00:09:25.438 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.144952 s, 14.5 MB/s 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:25.438 128+0 records in 00:09:25.438 128+0 records out 00:09:25.438 65536 bytes (66 kB, 64 KiB) copied, 0.000242869 s, 270 MB/s 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:25.438 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:25.697 2035+0 records in 00:09:25.697 2035+0 records out 00:09:25.697 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0122142 s, 85.3 MB/s 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:25.697 456+0 records in 00:09:25.697 456+0 records out 00:09:25.697 233472 bytes (233 kB, 228 KiB) copied, 0.00275169 s, 84.8 MB/s 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:25.697 [2024-07-15 13:32:13.286659] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.697 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:25.956 13:32:13 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 4163418 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 4163418 ']' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 4163418 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4163418 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4163418' 00:09:25.956 killing process with pid 4163418 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 4163418 00:09:25.956 [2024-07-15 13:32:13.570652] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:25.956 [2024-07-15 13:32:13.570710] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:25.956 [2024-07-15 13:32:13.570742] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:25.956 [2024-07-15 13:32:13.570750] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf040a0 name raid, state offline 00:09:25.956 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 4163418 00:09:26.215 [2024-07-15 13:32:13.587081] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:26.215 13:32:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:09:26.215 00:09:26.215 real 0m2.680s 00:09:26.215 user 0m3.426s 00:09:26.215 sys 0m1.020s 00:09:26.215 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:26.215 13:32:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:26.215 ************************************ 00:09:26.215 END TEST raid_function_test_raid0 00:09:26.215 ************************************ 00:09:26.216 13:32:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:26.216 13:32:13 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:09:26.216 
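The raid_unmap_data_verify portion of the raid0 trace above (the dd / blkdiscard / cmp sequence) condenses to roughly the loop below. Offsets and block counts are the ones printed in the log; the zero-fill of the reference file encodes the test's expectation that discarded blocks on the raid device read back as zeroes.

#!/usr/bin/env bash
# Condensed form of the discard-and-verify loop run against the exported nbd.
set -e

nbd=/dev/nbd0
ref=/raidtest/raidrandtest
blksize=512
rw_blk_num=4096

# Seed the device with random data and confirm it reads back intact.
dd if=/dev/urandom of=$ref bs=$blksize count=$rw_blk_num
dd if=$ref of=$nbd bs=$blksize count=$rw_blk_num oflag=direct
blockdev --flushbufs $nbd
cmp -b -n $((rw_blk_num * blksize)) $ref $nbd

unmap_blk_offs=(0 1028 321)
unmap_blk_nums=(128 2035 456)
for i in "${!unmap_blk_offs[@]}"; do
    off=${unmap_blk_offs[$i]}
    num=${unmap_blk_nums[$i]}
    # Zero the range in the reference file, discard it on the device, re-verify.
    dd if=/dev/zero of=$ref bs=$blksize seek=$off count=$num conv=notrunc
    blkdiscard -o $((off * blksize)) -l $((num * blksize)) $nbd
    blockdev --flushbufs $nbd
    cmp -b -n $((rw_blk_num * blksize)) $ref $nbd
done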
13:32:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:26.216 13:32:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:26.475 13:32:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:26.475 ************************************ 00:09:26.475 START TEST raid_function_test_concat 00:09:26.475 ************************************ 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=4163860 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 4163860' 00:09:26.475 Process raid pid: 4163860 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 4163860 /var/tmp/spdk-raid.sock 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 4163860 ']' 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:26.475 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:26.475 13:32:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:26.475 [2024-07-15 13:32:13.928813] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:26.475 [2024-07-15 13:32:13.928869] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:26.476 [2024-07-15 13:32:14.016045] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:26.735 [2024-07-15 13:32:14.107590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.735 [2024-07-15 13:32:14.165295] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:26.735 [2024-07-15 13:32:14.165322] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:27.357 [2024-07-15 13:32:14.931195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:27.357 [2024-07-15 13:32:14.931973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:27.357 [2024-07-15 13:32:14.932032] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26920a0 00:09:27.357 [2024-07-15 13:32:14.932041] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:27.357 [2024-07-15 13:32:14.932221] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2695a30 00:09:27.357 [2024-07-15 13:32:14.932299] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26920a0 00:09:27.357 [2024-07-15 13:32:14.932305] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x26920a0 00:09:27.357 [2024-07-15 13:32:14.932376] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:27.357 Base_1 00:09:27.357 Base_2 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:27.357 13:32:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:27.615 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:27.615 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:27.615 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:27.615 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:09:27.615 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:27.615 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:27.616 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:27.616 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:27.616 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:09:27.616 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:27.616 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:27.616 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:27.874 [2024-07-15 13:32:15.292165] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2695840 00:09:27.875 /dev/nbd0 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:27.875 1+0 records in 00:09:27.875 1+0 records out 00:09:27.875 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245291 s, 16.7 MB/s 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:27.875 
13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:27.875 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:28.134 { 00:09:28.134 "nbd_device": "/dev/nbd0", 00:09:28.134 "bdev_name": "raid" 00:09:28.134 } 00:09:28.134 ]' 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:28.134 { 00:09:28.134 "nbd_device": "/dev/nbd0", 00:09:28.134 "bdev_name": "raid" 00:09:28.134 } 00:09:28.134 ]' 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:28.134 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:28.135 4096+0 records in 00:09:28.135 4096+0 records out 00:09:28.135 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0302566 s, 69.3 MB/s 00:09:28.135 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:28.393 4096+0 records in 00:09:28.393 4096+0 records out 00:09:28.393 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.197705 s, 10.6 MB/s 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:28.393 128+0 records in 00:09:28.393 128+0 records out 00:09:28.393 65536 bytes (66 kB, 64 KiB) copied, 0.000815517 s, 80.4 MB/s 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:28.393 2035+0 records in 00:09:28.393 2035+0 records out 00:09:28.393 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0113772 s, 91.6 MB/s 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:28.393 456+0 records in 00:09:28.393 456+0 
records out 00:09:28.393 233472 bytes (233 kB, 228 KiB) copied, 0.00223346 s, 105 MB/s 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:28.393 13:32:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:28.651 [2024-07-15 13:32:16.090158] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:28.651 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 4163860 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 4163860 ']' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 4163860 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4163860 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4163860' 00:09:28.910 killing process with pid 4163860 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 4163860 00:09:28.910 [2024-07-15 13:32:16.378050] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:28.910 [2024-07-15 13:32:16.378106] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:28.910 [2024-07-15 13:32:16.378139] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:28.910 [2024-07-15 13:32:16.378147] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26920a0 name raid, state offline 00:09:28.910 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 4163860 00:09:28.910 [2024-07-15 13:32:16.395852] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:29.167 13:32:16 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:09:29.167 00:09:29.167 real 0m2.728s 00:09:29.167 user 0m3.462s 00:09:29.167 sys 0m1.016s 00:09:29.167 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:29.167 13:32:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:29.167 ************************************ 00:09:29.167 END TEST raid_function_test_concat 00:09:29.167 ************************************ 00:09:29.167 13:32:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:29.167 13:32:16 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:09:29.167 13:32:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:29.167 13:32:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:29.167 13:32:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
00:09:29.167 ************************************ 00:09:29.167 START TEST raid0_resize_test 00:09:29.167 ************************************ 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=4164303 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 4164303' 00:09:29.167 Process raid pid: 4164303 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 4164303 /var/tmp/spdk-raid.sock 00:09:29.167 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 4164303 ']' 00:09:29.168 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:29.168 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:29.168 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:29.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:29.168 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:29.168 13:32:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:29.168 [2024-07-15 13:32:16.732865] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:29.168 [2024-07-15 13:32:16.732912] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:29.424 [2024-07-15 13:32:16.821672] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:29.424 [2024-07-15 13:32:16.908873] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.424 [2024-07-15 13:32:16.958014] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:29.424 [2024-07-15 13:32:16.958037] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:29.988 13:32:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:29.988 13:32:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:09:29.988 13:32:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:09:30.246 Base_1 00:09:30.246 13:32:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:09:30.246 Base_2 00:09:30.246 13:32:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:09:30.506 [2024-07-15 13:32:18.003394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:30.506 [2024-07-15 13:32:18.004527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:30.506 [2024-07-15 13:32:18.004567] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1406250 00:09:30.506 [2024-07-15 13:32:18.004574] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:30.506 [2024-07-15 13:32:18.004729] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1406530 00:09:30.506 [2024-07-15 13:32:18.004798] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1406250 00:09:30.506 [2024-07-15 13:32:18.004804] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1406250 00:09:30.506 [2024-07-15 13:32:18.004890] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:30.506 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:09:30.764 [2024-07-15 13:32:18.171816] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:30.764 [2024-07-15 13:32:18.171836] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:09:30.764 true 00:09:30.764 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:30.764 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:09:30.764 [2024-07-15 13:32:18.336336] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:30.764 13:32:18 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:09:30.764 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:09:30.764 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:09:30.764 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:09:31.023 [2024-07-15 13:32:18.508670] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:31.023 [2024-07-15 13:32:18.508689] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:09:31.023 [2024-07-15 13:32:18.508707] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:09:31.023 true 00:09:31.023 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:31.023 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:09:31.282 [2024-07-15 13:32:18.673195] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 4164303 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 4164303 ']' 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 4164303 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4164303 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4164303' 00:09:31.282 killing process with pid 4164303 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 4164303 00:09:31.282 [2024-07-15 13:32:18.733107] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:31.282 [2024-07-15 13:32:18.733150] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:31.282 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 4164303 00:09:31.282 [2024-07-15 13:32:18.733183] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:31.282 [2024-07-15 13:32:18.733192] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1406250 name Raid, state offline 00:09:31.282 [2024-07-15 13:32:18.734369] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:31.541 13:32:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
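The resize flow traced above reduces to a short RPC sequence against the bdev_svc target. The following is a minimal sketch reconstructed from the trace, assuming the target is already listening on /var/tmp/spdk-raid.sock; the $rpc shorthand for the workspace rpc.py invocation is introduced here for readability and is not part of the original trace.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # two 32 MiB null bdevs with 512 B blocks (65536 blocks each)
    $rpc bdev_null_create Base_1 32 512
    $rpc bdev_null_create Base_2 32 512

    # raid0 across both with a 64 KiB strip size -> 131072 blocks total
    $rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid

    # growing only one base bdev leaves the raid0 size unchanged
    $rpc bdev_null_resize Base_1 64
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # still 131072

    # once both base bdevs are 64 MiB, the raid0 doubles to 262144 blocks
    $rpc bdev_null_resize Base_2 64
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # 262144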
00:09:31.541 00:09:31.541 real 0m2.240s 00:09:31.541 user 0m3.285s 00:09:31.541 sys 0m0.487s 00:09:31.541 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.541 13:32:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:31.541 ************************************ 00:09:31.541 END TEST raid0_resize_test 00:09:31.542 ************************************ 00:09:31.542 13:32:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:31.542 13:32:18 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:09:31.542 13:32:18 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:31.542 13:32:18 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:09:31.542 13:32:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:31.542 13:32:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.542 13:32:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:31.542 ************************************ 00:09:31.542 START TEST raid_state_function_test 00:09:31.542 ************************************ 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:31.542 13:32:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4164626 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4164626' 00:09:31.542 Process raid pid: 4164626 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4164626 /var/tmp/spdk-raid.sock 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4164626 ']' 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:31.542 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:31.542 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:31.542 [2024-07-15 13:32:19.065183] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:31.542 [2024-07-15 13:32:19.065245] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:31.542 [2024-07-15 13:32:19.154611] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:31.801 [2024-07-15 13:32:19.237240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.801 [2024-07-15 13:32:19.295378] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:31.801 [2024-07-15 13:32:19.295404] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:32.368 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:32.368 13:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:09:32.368 13:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:32.627 [2024-07-15 13:32:20.013008] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:32.627 [2024-07-15 13:32:20.013045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:32.627 [2024-07-15 13:32:20.013053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:32.627 [2024-07-15 13:32:20.013061] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:32.627 "name": "Existed_Raid", 00:09:32.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:32.627 "strip_size_kb": 64, 00:09:32.627 "state": "configuring", 00:09:32.627 "raid_level": "raid0", 00:09:32.627 "superblock": false, 00:09:32.627 
"num_base_bdevs": 2, 00:09:32.627 "num_base_bdevs_discovered": 0, 00:09:32.627 "num_base_bdevs_operational": 2, 00:09:32.627 "base_bdevs_list": [ 00:09:32.627 { 00:09:32.627 "name": "BaseBdev1", 00:09:32.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:32.627 "is_configured": false, 00:09:32.627 "data_offset": 0, 00:09:32.627 "data_size": 0 00:09:32.627 }, 00:09:32.627 { 00:09:32.627 "name": "BaseBdev2", 00:09:32.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:32.627 "is_configured": false, 00:09:32.627 "data_offset": 0, 00:09:32.627 "data_size": 0 00:09:32.627 } 00:09:32.627 ] 00:09:32.627 }' 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:32.627 13:32:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:33.194 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:33.453 [2024-07-15 13:32:20.847061] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:33.453 [2024-07-15 13:32:20.847088] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x224bf30 name Existed_Raid, state configuring 00:09:33.453 13:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:33.453 [2024-07-15 13:32:21.019515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:33.453 [2024-07-15 13:32:21.019537] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:33.453 [2024-07-15 13:32:21.019543] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:33.453 [2024-07-15 13:32:21.019551] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:33.453 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:33.713 [2024-07-15 13:32:21.200632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:33.713 BaseBdev1 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:33.713 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:33.971 [ 
00:09:33.971 { 00:09:33.971 "name": "BaseBdev1", 00:09:33.971 "aliases": [ 00:09:33.971 "9149a4b0-63ac-4c44-af84-8c9aa6fbb040" 00:09:33.971 ], 00:09:33.971 "product_name": "Malloc disk", 00:09:33.971 "block_size": 512, 00:09:33.971 "num_blocks": 65536, 00:09:33.971 "uuid": "9149a4b0-63ac-4c44-af84-8c9aa6fbb040", 00:09:33.971 "assigned_rate_limits": { 00:09:33.971 "rw_ios_per_sec": 0, 00:09:33.971 "rw_mbytes_per_sec": 0, 00:09:33.971 "r_mbytes_per_sec": 0, 00:09:33.971 "w_mbytes_per_sec": 0 00:09:33.971 }, 00:09:33.971 "claimed": true, 00:09:33.971 "claim_type": "exclusive_write", 00:09:33.971 "zoned": false, 00:09:33.971 "supported_io_types": { 00:09:33.971 "read": true, 00:09:33.971 "write": true, 00:09:33.971 "unmap": true, 00:09:33.971 "flush": true, 00:09:33.971 "reset": true, 00:09:33.971 "nvme_admin": false, 00:09:33.971 "nvme_io": false, 00:09:33.971 "nvme_io_md": false, 00:09:33.971 "write_zeroes": true, 00:09:33.971 "zcopy": true, 00:09:33.971 "get_zone_info": false, 00:09:33.971 "zone_management": false, 00:09:33.971 "zone_append": false, 00:09:33.971 "compare": false, 00:09:33.971 "compare_and_write": false, 00:09:33.971 "abort": true, 00:09:33.971 "seek_hole": false, 00:09:33.971 "seek_data": false, 00:09:33.971 "copy": true, 00:09:33.971 "nvme_iov_md": false 00:09:33.971 }, 00:09:33.971 "memory_domains": [ 00:09:33.971 { 00:09:33.971 "dma_device_id": "system", 00:09:33.971 "dma_device_type": 1 00:09:33.971 }, 00:09:33.971 { 00:09:33.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:33.971 "dma_device_type": 2 00:09:33.971 } 00:09:33.971 ], 00:09:33.971 "driver_specific": {} 00:09:33.971 } 00:09:33.971 ] 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:33.971 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:34.228 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:34.228 "name": "Existed_Raid", 00:09:34.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:34.228 "strip_size_kb": 64, 00:09:34.228 "state": "configuring", 00:09:34.228 "raid_level": "raid0", 
00:09:34.228 "superblock": false, 00:09:34.228 "num_base_bdevs": 2, 00:09:34.228 "num_base_bdevs_discovered": 1, 00:09:34.228 "num_base_bdevs_operational": 2, 00:09:34.228 "base_bdevs_list": [ 00:09:34.228 { 00:09:34.228 "name": "BaseBdev1", 00:09:34.228 "uuid": "9149a4b0-63ac-4c44-af84-8c9aa6fbb040", 00:09:34.228 "is_configured": true, 00:09:34.228 "data_offset": 0, 00:09:34.228 "data_size": 65536 00:09:34.228 }, 00:09:34.228 { 00:09:34.228 "name": "BaseBdev2", 00:09:34.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:34.228 "is_configured": false, 00:09:34.228 "data_offset": 0, 00:09:34.228 "data_size": 0 00:09:34.228 } 00:09:34.228 ] 00:09:34.228 }' 00:09:34.228 13:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:34.228 13:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:34.793 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:34.793 [2024-07-15 13:32:22.379679] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:34.793 [2024-07-15 13:32:22.379718] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x224b820 name Existed_Raid, state configuring 00:09:34.793 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:35.051 [2024-07-15 13:32:22.556162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:35.051 [2024-07-15 13:32:22.557215] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:35.052 [2024-07-15 13:32:22.557241] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:35.052 13:32:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:35.310 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:35.310 "name": "Existed_Raid", 00:09:35.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.310 "strip_size_kb": 64, 00:09:35.310 "state": "configuring", 00:09:35.310 "raid_level": "raid0", 00:09:35.310 "superblock": false, 00:09:35.310 "num_base_bdevs": 2, 00:09:35.310 "num_base_bdevs_discovered": 1, 00:09:35.310 "num_base_bdevs_operational": 2, 00:09:35.310 "base_bdevs_list": [ 00:09:35.310 { 00:09:35.310 "name": "BaseBdev1", 00:09:35.310 "uuid": "9149a4b0-63ac-4c44-af84-8c9aa6fbb040", 00:09:35.310 "is_configured": true, 00:09:35.310 "data_offset": 0, 00:09:35.310 "data_size": 65536 00:09:35.310 }, 00:09:35.310 { 00:09:35.310 "name": "BaseBdev2", 00:09:35.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.310 "is_configured": false, 00:09:35.310 "data_offset": 0, 00:09:35.310 "data_size": 0 00:09:35.310 } 00:09:35.310 ] 00:09:35.310 }' 00:09:35.310 13:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:35.310 13:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:35.874 [2024-07-15 13:32:23.421237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:35.874 [2024-07-15 13:32:23.421269] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x224c610 00:09:35.874 [2024-07-15 13:32:23.421275] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:35.874 [2024-07-15 13:32:23.421409] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23effb0 00:09:35.874 [2024-07-15 13:32:23.421493] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x224c610 00:09:35.874 [2024-07-15 13:32:23.421499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x224c610 00:09:35.874 [2024-07-15 13:32:23.421629] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:35.874 BaseBdev2 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:35.874 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:36.132 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:09:36.390 [ 00:09:36.390 { 00:09:36.390 "name": "BaseBdev2", 00:09:36.390 "aliases": [ 00:09:36.390 "0048d68e-3e06-4e2c-8130-8127bbc22842" 00:09:36.390 ], 00:09:36.390 "product_name": "Malloc disk", 00:09:36.390 "block_size": 512, 00:09:36.390 "num_blocks": 65536, 00:09:36.390 "uuid": "0048d68e-3e06-4e2c-8130-8127bbc22842", 00:09:36.390 "assigned_rate_limits": { 00:09:36.390 "rw_ios_per_sec": 0, 00:09:36.390 "rw_mbytes_per_sec": 0, 00:09:36.390 "r_mbytes_per_sec": 0, 00:09:36.390 "w_mbytes_per_sec": 0 00:09:36.390 }, 00:09:36.390 "claimed": true, 00:09:36.390 "claim_type": "exclusive_write", 00:09:36.390 "zoned": false, 00:09:36.390 "supported_io_types": { 00:09:36.390 "read": true, 00:09:36.390 "write": true, 00:09:36.390 "unmap": true, 00:09:36.390 "flush": true, 00:09:36.390 "reset": true, 00:09:36.390 "nvme_admin": false, 00:09:36.390 "nvme_io": false, 00:09:36.390 "nvme_io_md": false, 00:09:36.390 "write_zeroes": true, 00:09:36.390 "zcopy": true, 00:09:36.390 "get_zone_info": false, 00:09:36.390 "zone_management": false, 00:09:36.390 "zone_append": false, 00:09:36.390 "compare": false, 00:09:36.390 "compare_and_write": false, 00:09:36.390 "abort": true, 00:09:36.390 "seek_hole": false, 00:09:36.390 "seek_data": false, 00:09:36.390 "copy": true, 00:09:36.390 "nvme_iov_md": false 00:09:36.390 }, 00:09:36.390 "memory_domains": [ 00:09:36.390 { 00:09:36.390 "dma_device_id": "system", 00:09:36.390 "dma_device_type": 1 00:09:36.390 }, 00:09:36.390 { 00:09:36.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:36.390 "dma_device_type": 2 00:09:36.390 } 00:09:36.390 ], 00:09:36.390 "driver_specific": {} 00:09:36.390 } 00:09:36.390 ] 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:09:36.390 "name": "Existed_Raid", 00:09:36.390 "uuid": "0b6e2453-86af-4647-8e71-5b305685b595", 00:09:36.390 "strip_size_kb": 64, 00:09:36.390 "state": "online", 00:09:36.390 "raid_level": "raid0", 00:09:36.390 "superblock": false, 00:09:36.390 "num_base_bdevs": 2, 00:09:36.390 "num_base_bdevs_discovered": 2, 00:09:36.390 "num_base_bdevs_operational": 2, 00:09:36.390 "base_bdevs_list": [ 00:09:36.390 { 00:09:36.390 "name": "BaseBdev1", 00:09:36.390 "uuid": "9149a4b0-63ac-4c44-af84-8c9aa6fbb040", 00:09:36.390 "is_configured": true, 00:09:36.390 "data_offset": 0, 00:09:36.390 "data_size": 65536 00:09:36.390 }, 00:09:36.390 { 00:09:36.390 "name": "BaseBdev2", 00:09:36.390 "uuid": "0048d68e-3e06-4e2c-8130-8127bbc22842", 00:09:36.390 "is_configured": true, 00:09:36.390 "data_offset": 0, 00:09:36.390 "data_size": 65536 00:09:36.390 } 00:09:36.390 ] 00:09:36.390 }' 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:36.390 13:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:36.955 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:37.213 [2024-07-15 13:32:24.636601] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:37.213 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:37.213 "name": "Existed_Raid", 00:09:37.213 "aliases": [ 00:09:37.213 "0b6e2453-86af-4647-8e71-5b305685b595" 00:09:37.213 ], 00:09:37.213 "product_name": "Raid Volume", 00:09:37.213 "block_size": 512, 00:09:37.213 "num_blocks": 131072, 00:09:37.213 "uuid": "0b6e2453-86af-4647-8e71-5b305685b595", 00:09:37.213 "assigned_rate_limits": { 00:09:37.213 "rw_ios_per_sec": 0, 00:09:37.213 "rw_mbytes_per_sec": 0, 00:09:37.213 "r_mbytes_per_sec": 0, 00:09:37.213 "w_mbytes_per_sec": 0 00:09:37.213 }, 00:09:37.213 "claimed": false, 00:09:37.213 "zoned": false, 00:09:37.213 "supported_io_types": { 00:09:37.213 "read": true, 00:09:37.213 "write": true, 00:09:37.213 "unmap": true, 00:09:37.213 "flush": true, 00:09:37.213 "reset": true, 00:09:37.213 "nvme_admin": false, 00:09:37.213 "nvme_io": false, 00:09:37.213 "nvme_io_md": false, 00:09:37.213 "write_zeroes": true, 00:09:37.213 "zcopy": false, 00:09:37.213 "get_zone_info": false, 00:09:37.213 "zone_management": false, 00:09:37.213 "zone_append": false, 00:09:37.213 "compare": false, 00:09:37.213 "compare_and_write": false, 00:09:37.213 "abort": false, 00:09:37.213 "seek_hole": false, 00:09:37.213 "seek_data": false, 00:09:37.213 "copy": false, 00:09:37.213 "nvme_iov_md": false 00:09:37.213 }, 
00:09:37.213 "memory_domains": [ 00:09:37.213 { 00:09:37.213 "dma_device_id": "system", 00:09:37.213 "dma_device_type": 1 00:09:37.213 }, 00:09:37.213 { 00:09:37.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:37.213 "dma_device_type": 2 00:09:37.213 }, 00:09:37.213 { 00:09:37.213 "dma_device_id": "system", 00:09:37.213 "dma_device_type": 1 00:09:37.213 }, 00:09:37.213 { 00:09:37.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:37.213 "dma_device_type": 2 00:09:37.213 } 00:09:37.213 ], 00:09:37.213 "driver_specific": { 00:09:37.214 "raid": { 00:09:37.214 "uuid": "0b6e2453-86af-4647-8e71-5b305685b595", 00:09:37.214 "strip_size_kb": 64, 00:09:37.214 "state": "online", 00:09:37.214 "raid_level": "raid0", 00:09:37.214 "superblock": false, 00:09:37.214 "num_base_bdevs": 2, 00:09:37.214 "num_base_bdevs_discovered": 2, 00:09:37.214 "num_base_bdevs_operational": 2, 00:09:37.214 "base_bdevs_list": [ 00:09:37.214 { 00:09:37.214 "name": "BaseBdev1", 00:09:37.214 "uuid": "9149a4b0-63ac-4c44-af84-8c9aa6fbb040", 00:09:37.214 "is_configured": true, 00:09:37.214 "data_offset": 0, 00:09:37.214 "data_size": 65536 00:09:37.214 }, 00:09:37.214 { 00:09:37.214 "name": "BaseBdev2", 00:09:37.214 "uuid": "0048d68e-3e06-4e2c-8130-8127bbc22842", 00:09:37.214 "is_configured": true, 00:09:37.214 "data_offset": 0, 00:09:37.214 "data_size": 65536 00:09:37.214 } 00:09:37.214 ] 00:09:37.214 } 00:09:37.214 } 00:09:37.214 }' 00:09:37.214 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:37.214 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:37.214 BaseBdev2' 00:09:37.214 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:37.214 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:37.214 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:37.471 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:37.471 "name": "BaseBdev1", 00:09:37.471 "aliases": [ 00:09:37.471 "9149a4b0-63ac-4c44-af84-8c9aa6fbb040" 00:09:37.471 ], 00:09:37.471 "product_name": "Malloc disk", 00:09:37.471 "block_size": 512, 00:09:37.471 "num_blocks": 65536, 00:09:37.471 "uuid": "9149a4b0-63ac-4c44-af84-8c9aa6fbb040", 00:09:37.471 "assigned_rate_limits": { 00:09:37.471 "rw_ios_per_sec": 0, 00:09:37.471 "rw_mbytes_per_sec": 0, 00:09:37.471 "r_mbytes_per_sec": 0, 00:09:37.471 "w_mbytes_per_sec": 0 00:09:37.471 }, 00:09:37.471 "claimed": true, 00:09:37.471 "claim_type": "exclusive_write", 00:09:37.471 "zoned": false, 00:09:37.471 "supported_io_types": { 00:09:37.471 "read": true, 00:09:37.471 "write": true, 00:09:37.471 "unmap": true, 00:09:37.471 "flush": true, 00:09:37.471 "reset": true, 00:09:37.471 "nvme_admin": false, 00:09:37.471 "nvme_io": false, 00:09:37.471 "nvme_io_md": false, 00:09:37.471 "write_zeroes": true, 00:09:37.471 "zcopy": true, 00:09:37.471 "get_zone_info": false, 00:09:37.471 "zone_management": false, 00:09:37.471 "zone_append": false, 00:09:37.471 "compare": false, 00:09:37.471 "compare_and_write": false, 00:09:37.471 "abort": true, 00:09:37.471 "seek_hole": false, 00:09:37.471 "seek_data": false, 00:09:37.471 "copy": true, 00:09:37.471 "nvme_iov_md": false 00:09:37.471 }, 
00:09:37.471 "memory_domains": [ 00:09:37.471 { 00:09:37.471 "dma_device_id": "system", 00:09:37.471 "dma_device_type": 1 00:09:37.471 }, 00:09:37.471 { 00:09:37.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:37.471 "dma_device_type": 2 00:09:37.471 } 00:09:37.471 ], 00:09:37.471 "driver_specific": {} 00:09:37.471 }' 00:09:37.471 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:37.471 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:37.471 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:37.471 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:37.471 13:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:37.471 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:37.471 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:37.471 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:37.728 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:37.985 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:37.985 "name": "BaseBdev2", 00:09:37.985 "aliases": [ 00:09:37.985 "0048d68e-3e06-4e2c-8130-8127bbc22842" 00:09:37.985 ], 00:09:37.985 "product_name": "Malloc disk", 00:09:37.985 "block_size": 512, 00:09:37.986 "num_blocks": 65536, 00:09:37.986 "uuid": "0048d68e-3e06-4e2c-8130-8127bbc22842", 00:09:37.986 "assigned_rate_limits": { 00:09:37.986 "rw_ios_per_sec": 0, 00:09:37.986 "rw_mbytes_per_sec": 0, 00:09:37.986 "r_mbytes_per_sec": 0, 00:09:37.986 "w_mbytes_per_sec": 0 00:09:37.986 }, 00:09:37.986 "claimed": true, 00:09:37.986 "claim_type": "exclusive_write", 00:09:37.986 "zoned": false, 00:09:37.986 "supported_io_types": { 00:09:37.986 "read": true, 00:09:37.986 "write": true, 00:09:37.986 "unmap": true, 00:09:37.986 "flush": true, 00:09:37.986 "reset": true, 00:09:37.986 "nvme_admin": false, 00:09:37.986 "nvme_io": false, 00:09:37.986 "nvme_io_md": false, 00:09:37.986 "write_zeroes": true, 00:09:37.986 "zcopy": true, 00:09:37.986 "get_zone_info": false, 00:09:37.986 "zone_management": false, 00:09:37.986 "zone_append": false, 00:09:37.986 "compare": false, 00:09:37.986 "compare_and_write": false, 00:09:37.986 "abort": true, 00:09:37.986 "seek_hole": false, 00:09:37.986 "seek_data": false, 00:09:37.986 "copy": true, 00:09:37.986 "nvme_iov_md": false 00:09:37.986 }, 00:09:37.986 "memory_domains": [ 00:09:37.986 { 00:09:37.986 "dma_device_id": "system", 00:09:37.986 "dma_device_type": 1 00:09:37.986 }, 00:09:37.986 { 00:09:37.986 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:37.986 "dma_device_type": 2 00:09:37.986 } 00:09:37.986 ], 00:09:37.986 "driver_specific": {} 00:09:37.986 }' 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:37.986 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:38.243 [2024-07-15 13:32:25.835530] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:38.243 [2024-07-15 13:32:25.835556] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:38.243 [2024-07-15 13:32:25.835585] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:38.243 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:38.500 13:32:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:38.500 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:38.500 13:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:38.500 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:38.500 "name": "Existed_Raid", 00:09:38.500 "uuid": "0b6e2453-86af-4647-8e71-5b305685b595", 00:09:38.500 "strip_size_kb": 64, 00:09:38.500 "state": "offline", 00:09:38.500 "raid_level": "raid0", 00:09:38.500 "superblock": false, 00:09:38.500 "num_base_bdevs": 2, 00:09:38.500 "num_base_bdevs_discovered": 1, 00:09:38.500 "num_base_bdevs_operational": 1, 00:09:38.500 "base_bdevs_list": [ 00:09:38.500 { 00:09:38.500 "name": null, 00:09:38.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.500 "is_configured": false, 00:09:38.500 "data_offset": 0, 00:09:38.500 "data_size": 65536 00:09:38.500 }, 00:09:38.500 { 00:09:38.500 "name": "BaseBdev2", 00:09:38.500 "uuid": "0048d68e-3e06-4e2c-8130-8127bbc22842", 00:09:38.500 "is_configured": true, 00:09:38.500 "data_offset": 0, 00:09:38.500 "data_size": 65536 00:09:38.500 } 00:09:38.500 ] 00:09:38.500 }' 00:09:38.500 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:38.500 13:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:39.064 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:39.319 [2024-07-15 13:32:26.826969] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:39.319 [2024-07-15 13:32:26.827028] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x224c610 name Existed_Raid, state offline 00:09:39.319 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:39.319 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:39.319 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:39.319 13:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:39.575 13:32:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4164626 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4164626 ']' 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4164626 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4164626 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4164626' 00:09:39.575 killing process with pid 4164626 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4164626 00:09:39.575 [2024-07-15 13:32:27.081706] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:39.575 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4164626 00:09:39.575 [2024-07-15 13:32:27.082634] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:39.832 00:09:39.832 real 0m8.289s 00:09:39.832 user 0m14.519s 00:09:39.832 sys 0m1.627s 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:39.832 ************************************ 00:09:39.832 END TEST raid_state_function_test 00:09:39.832 ************************************ 00:09:39.832 13:32:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:39.832 13:32:27 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:09:39.832 13:32:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:39.832 13:32:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:39.832 13:32:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:39.832 ************************************ 00:09:39.832 START TEST raid_state_function_test_sb 00:09:39.832 ************************************ 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4165983 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4165983' 00:09:39.832 Process raid pid: 4165983 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4165983 /var/tmp/spdk-raid.sock 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4165983 ']' 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:39.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:39.832 13:32:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:39.832 [2024-07-15 13:32:27.432924] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:39.832 [2024-07-15 13:32:27.432972] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:40.088 [2024-07-15 13:32:27.520213] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:40.088 [2024-07-15 13:32:27.607226] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:40.088 [2024-07-15 13:32:27.660076] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:40.088 [2024-07-15 13:32:27.660101] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:40.651 13:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:40.651 13:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:09:40.651 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:40.909 [2024-07-15 13:32:28.390036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:40.909 [2024-07-15 13:32:28.390071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:40.909 [2024-07-15 13:32:28.390079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:40.909 [2024-07-15 13:32:28.390087] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:40.909 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:41.166 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:41.166 "name": "Existed_Raid", 00:09:41.166 "uuid": "34db9570-7aa9-418f-8f10-86078a4c69cb", 00:09:41.166 "strip_size_kb": 64, 00:09:41.167 "state": "configuring", 00:09:41.167 "raid_level": 
"raid0", 00:09:41.167 "superblock": true, 00:09:41.167 "num_base_bdevs": 2, 00:09:41.167 "num_base_bdevs_discovered": 0, 00:09:41.167 "num_base_bdevs_operational": 2, 00:09:41.167 "base_bdevs_list": [ 00:09:41.167 { 00:09:41.167 "name": "BaseBdev1", 00:09:41.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:41.167 "is_configured": false, 00:09:41.167 "data_offset": 0, 00:09:41.167 "data_size": 0 00:09:41.167 }, 00:09:41.167 { 00:09:41.167 "name": "BaseBdev2", 00:09:41.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:41.167 "is_configured": false, 00:09:41.167 "data_offset": 0, 00:09:41.167 "data_size": 0 00:09:41.167 } 00:09:41.167 ] 00:09:41.167 }' 00:09:41.167 13:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:41.167 13:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:41.732 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:41.732 [2024-07-15 13:32:29.228125] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:41.732 [2024-07-15 13:32:29.228151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1032f30 name Existed_Raid, state configuring 00:09:41.732 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:41.990 [2024-07-15 13:32:29.404599] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:41.990 [2024-07-15 13:32:29.404624] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:41.990 [2024-07-15 13:32:29.404630] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:41.990 [2024-07-15 13:32:29.404637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:41.990 [2024-07-15 13:32:29.585752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:41.990 BaseBdev1 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:41.990 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:42.247 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:42.505 [ 00:09:42.505 { 00:09:42.505 "name": "BaseBdev1", 00:09:42.505 "aliases": [ 00:09:42.505 "b7de83fe-0654-47be-9ea1-c83ba56631f0" 00:09:42.505 ], 00:09:42.505 "product_name": "Malloc disk", 00:09:42.505 "block_size": 512, 00:09:42.505 "num_blocks": 65536, 00:09:42.505 "uuid": "b7de83fe-0654-47be-9ea1-c83ba56631f0", 00:09:42.505 "assigned_rate_limits": { 00:09:42.505 "rw_ios_per_sec": 0, 00:09:42.505 "rw_mbytes_per_sec": 0, 00:09:42.505 "r_mbytes_per_sec": 0, 00:09:42.505 "w_mbytes_per_sec": 0 00:09:42.505 }, 00:09:42.505 "claimed": true, 00:09:42.505 "claim_type": "exclusive_write", 00:09:42.505 "zoned": false, 00:09:42.505 "supported_io_types": { 00:09:42.505 "read": true, 00:09:42.505 "write": true, 00:09:42.505 "unmap": true, 00:09:42.505 "flush": true, 00:09:42.505 "reset": true, 00:09:42.505 "nvme_admin": false, 00:09:42.505 "nvme_io": false, 00:09:42.505 "nvme_io_md": false, 00:09:42.505 "write_zeroes": true, 00:09:42.505 "zcopy": true, 00:09:42.505 "get_zone_info": false, 00:09:42.505 "zone_management": false, 00:09:42.505 "zone_append": false, 00:09:42.505 "compare": false, 00:09:42.505 "compare_and_write": false, 00:09:42.505 "abort": true, 00:09:42.505 "seek_hole": false, 00:09:42.505 "seek_data": false, 00:09:42.505 "copy": true, 00:09:42.505 "nvme_iov_md": false 00:09:42.505 }, 00:09:42.505 "memory_domains": [ 00:09:42.505 { 00:09:42.505 "dma_device_id": "system", 00:09:42.505 "dma_device_type": 1 00:09:42.505 }, 00:09:42.505 { 00:09:42.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:42.505 "dma_device_type": 2 00:09:42.505 } 00:09:42.505 ], 00:09:42.505 "driver_specific": {} 00:09:42.505 } 00:09:42.505 ] 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:42.505 13:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:42.763 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:42.763 "name": 
"Existed_Raid", 00:09:42.763 "uuid": "2fb53067-930f-46d9-a151-0f4bc18351ce", 00:09:42.763 "strip_size_kb": 64, 00:09:42.763 "state": "configuring", 00:09:42.763 "raid_level": "raid0", 00:09:42.763 "superblock": true, 00:09:42.763 "num_base_bdevs": 2, 00:09:42.763 "num_base_bdevs_discovered": 1, 00:09:42.763 "num_base_bdevs_operational": 2, 00:09:42.763 "base_bdevs_list": [ 00:09:42.763 { 00:09:42.763 "name": "BaseBdev1", 00:09:42.763 "uuid": "b7de83fe-0654-47be-9ea1-c83ba56631f0", 00:09:42.763 "is_configured": true, 00:09:42.763 "data_offset": 2048, 00:09:42.763 "data_size": 63488 00:09:42.763 }, 00:09:42.763 { 00:09:42.763 "name": "BaseBdev2", 00:09:42.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:42.763 "is_configured": false, 00:09:42.763 "data_offset": 0, 00:09:42.763 "data_size": 0 00:09:42.763 } 00:09:42.763 ] 00:09:42.763 }' 00:09:42.763 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:42.763 13:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:43.021 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:43.278 [2024-07-15 13:32:30.772832] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:43.278 [2024-07-15 13:32:30.772866] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1032820 name Existed_Raid, state configuring 00:09:43.278 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:43.536 [2024-07-15 13:32:30.949332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:43.536 [2024-07-15 13:32:30.950435] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:43.536 [2024-07-15 13:32:30.950460] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:43.536 13:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.794 13:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:43.794 "name": "Existed_Raid", 00:09:43.794 "uuid": "7d1b3b8f-f410-40d5-9329-5279c0799f86", 00:09:43.794 "strip_size_kb": 64, 00:09:43.794 "state": "configuring", 00:09:43.794 "raid_level": "raid0", 00:09:43.794 "superblock": true, 00:09:43.794 "num_base_bdevs": 2, 00:09:43.794 "num_base_bdevs_discovered": 1, 00:09:43.794 "num_base_bdevs_operational": 2, 00:09:43.794 "base_bdevs_list": [ 00:09:43.794 { 00:09:43.794 "name": "BaseBdev1", 00:09:43.794 "uuid": "b7de83fe-0654-47be-9ea1-c83ba56631f0", 00:09:43.794 "is_configured": true, 00:09:43.794 "data_offset": 2048, 00:09:43.794 "data_size": 63488 00:09:43.794 }, 00:09:43.794 { 00:09:43.794 "name": "BaseBdev2", 00:09:43.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:43.794 "is_configured": false, 00:09:43.794 "data_offset": 0, 00:09:43.794 "data_size": 0 00:09:43.794 } 00:09:43.794 ] 00:09:43.794 }' 00:09:43.794 13:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:43.794 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:44.052 13:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:44.310 [2024-07-15 13:32:31.822374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:44.310 [2024-07-15 13:32:31.822490] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1033610 00:09:44.310 [2024-07-15 13:32:31.822499] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:44.310 [2024-07-15 13:32:31.822620] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e6fd0 00:09:44.310 [2024-07-15 13:32:31.822701] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1033610 00:09:44.310 [2024-07-15 13:32:31.822708] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1033610 00:09:44.310 [2024-07-15 13:32:31.822770] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:44.310 BaseBdev2 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:44.310 13:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:44.567 13:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:44.567 [ 00:09:44.567 { 00:09:44.567 "name": "BaseBdev2", 00:09:44.567 "aliases": [ 00:09:44.567 "57f06d28-3e78-4e3b-8eb3-50ca67714c38" 00:09:44.567 ], 00:09:44.567 "product_name": "Malloc disk", 00:09:44.567 "block_size": 512, 00:09:44.567 "num_blocks": 65536, 00:09:44.567 "uuid": "57f06d28-3e78-4e3b-8eb3-50ca67714c38", 00:09:44.567 "assigned_rate_limits": { 00:09:44.567 "rw_ios_per_sec": 0, 00:09:44.567 "rw_mbytes_per_sec": 0, 00:09:44.567 "r_mbytes_per_sec": 0, 00:09:44.567 "w_mbytes_per_sec": 0 00:09:44.567 }, 00:09:44.567 "claimed": true, 00:09:44.567 "claim_type": "exclusive_write", 00:09:44.567 "zoned": false, 00:09:44.567 "supported_io_types": { 00:09:44.567 "read": true, 00:09:44.567 "write": true, 00:09:44.567 "unmap": true, 00:09:44.567 "flush": true, 00:09:44.567 "reset": true, 00:09:44.567 "nvme_admin": false, 00:09:44.567 "nvme_io": false, 00:09:44.567 "nvme_io_md": false, 00:09:44.567 "write_zeroes": true, 00:09:44.567 "zcopy": true, 00:09:44.567 "get_zone_info": false, 00:09:44.567 "zone_management": false, 00:09:44.567 "zone_append": false, 00:09:44.567 "compare": false, 00:09:44.567 "compare_and_write": false, 00:09:44.567 "abort": true, 00:09:44.567 "seek_hole": false, 00:09:44.567 "seek_data": false, 00:09:44.567 "copy": true, 00:09:44.567 "nvme_iov_md": false 00:09:44.567 }, 00:09:44.567 "memory_domains": [ 00:09:44.567 { 00:09:44.567 "dma_device_id": "system", 00:09:44.567 "dma_device_type": 1 00:09:44.567 }, 00:09:44.567 { 00:09:44.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.567 "dma_device_type": 2 00:09:44.567 } 00:09:44.567 ], 00:09:44.567 "driver_specific": {} 00:09:44.567 } 00:09:44.567 ] 00:09:44.567 13:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:44.825 "name": "Existed_Raid", 00:09:44.825 "uuid": "7d1b3b8f-f410-40d5-9329-5279c0799f86", 00:09:44.825 "strip_size_kb": 64, 00:09:44.825 "state": "online", 00:09:44.825 "raid_level": "raid0", 00:09:44.825 "superblock": true, 00:09:44.825 "num_base_bdevs": 2, 00:09:44.825 "num_base_bdevs_discovered": 2, 00:09:44.825 "num_base_bdevs_operational": 2, 00:09:44.825 "base_bdevs_list": [ 00:09:44.825 { 00:09:44.825 "name": "BaseBdev1", 00:09:44.825 "uuid": "b7de83fe-0654-47be-9ea1-c83ba56631f0", 00:09:44.825 "is_configured": true, 00:09:44.825 "data_offset": 2048, 00:09:44.825 "data_size": 63488 00:09:44.825 }, 00:09:44.825 { 00:09:44.825 "name": "BaseBdev2", 00:09:44.825 "uuid": "57f06d28-3e78-4e3b-8eb3-50ca67714c38", 00:09:44.825 "is_configured": true, 00:09:44.825 "data_offset": 2048, 00:09:44.825 "data_size": 63488 00:09:44.825 } 00:09:44.825 ] 00:09:44.825 }' 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:44.825 13:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:45.390 13:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:45.647 [2024-07-15 13:32:33.025672] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:45.647 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:45.647 "name": "Existed_Raid", 00:09:45.647 "aliases": [ 00:09:45.647 "7d1b3b8f-f410-40d5-9329-5279c0799f86" 00:09:45.647 ], 00:09:45.647 "product_name": "Raid Volume", 00:09:45.647 "block_size": 512, 00:09:45.647 "num_blocks": 126976, 00:09:45.647 "uuid": "7d1b3b8f-f410-40d5-9329-5279c0799f86", 00:09:45.647 "assigned_rate_limits": { 00:09:45.647 "rw_ios_per_sec": 0, 00:09:45.647 "rw_mbytes_per_sec": 0, 00:09:45.647 "r_mbytes_per_sec": 0, 00:09:45.647 "w_mbytes_per_sec": 0 00:09:45.647 }, 00:09:45.647 "claimed": false, 00:09:45.647 "zoned": false, 00:09:45.647 "supported_io_types": { 00:09:45.647 "read": true, 00:09:45.647 "write": true, 00:09:45.647 "unmap": true, 00:09:45.647 "flush": true, 00:09:45.647 "reset": true, 00:09:45.647 "nvme_admin": false, 00:09:45.647 "nvme_io": false, 00:09:45.647 "nvme_io_md": false, 00:09:45.647 "write_zeroes": true, 
00:09:45.647 "zcopy": false, 00:09:45.647 "get_zone_info": false, 00:09:45.647 "zone_management": false, 00:09:45.647 "zone_append": false, 00:09:45.647 "compare": false, 00:09:45.647 "compare_and_write": false, 00:09:45.647 "abort": false, 00:09:45.647 "seek_hole": false, 00:09:45.647 "seek_data": false, 00:09:45.647 "copy": false, 00:09:45.647 "nvme_iov_md": false 00:09:45.647 }, 00:09:45.647 "memory_domains": [ 00:09:45.647 { 00:09:45.647 "dma_device_id": "system", 00:09:45.647 "dma_device_type": 1 00:09:45.647 }, 00:09:45.647 { 00:09:45.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:45.647 "dma_device_type": 2 00:09:45.647 }, 00:09:45.647 { 00:09:45.647 "dma_device_id": "system", 00:09:45.647 "dma_device_type": 1 00:09:45.647 }, 00:09:45.647 { 00:09:45.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:45.647 "dma_device_type": 2 00:09:45.647 } 00:09:45.647 ], 00:09:45.647 "driver_specific": { 00:09:45.647 "raid": { 00:09:45.647 "uuid": "7d1b3b8f-f410-40d5-9329-5279c0799f86", 00:09:45.647 "strip_size_kb": 64, 00:09:45.647 "state": "online", 00:09:45.647 "raid_level": "raid0", 00:09:45.647 "superblock": true, 00:09:45.647 "num_base_bdevs": 2, 00:09:45.647 "num_base_bdevs_discovered": 2, 00:09:45.647 "num_base_bdevs_operational": 2, 00:09:45.647 "base_bdevs_list": [ 00:09:45.647 { 00:09:45.647 "name": "BaseBdev1", 00:09:45.647 "uuid": "b7de83fe-0654-47be-9ea1-c83ba56631f0", 00:09:45.647 "is_configured": true, 00:09:45.647 "data_offset": 2048, 00:09:45.647 "data_size": 63488 00:09:45.647 }, 00:09:45.647 { 00:09:45.647 "name": "BaseBdev2", 00:09:45.647 "uuid": "57f06d28-3e78-4e3b-8eb3-50ca67714c38", 00:09:45.648 "is_configured": true, 00:09:45.648 "data_offset": 2048, 00:09:45.648 "data_size": 63488 00:09:45.648 } 00:09:45.648 ] 00:09:45.648 } 00:09:45.648 } 00:09:45.648 }' 00:09:45.648 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:45.648 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:45.648 BaseBdev2' 00:09:45.648 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:45.648 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:45.648 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:45.904 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:45.904 "name": "BaseBdev1", 00:09:45.904 "aliases": [ 00:09:45.904 "b7de83fe-0654-47be-9ea1-c83ba56631f0" 00:09:45.904 ], 00:09:45.904 "product_name": "Malloc disk", 00:09:45.904 "block_size": 512, 00:09:45.904 "num_blocks": 65536, 00:09:45.904 "uuid": "b7de83fe-0654-47be-9ea1-c83ba56631f0", 00:09:45.904 "assigned_rate_limits": { 00:09:45.904 "rw_ios_per_sec": 0, 00:09:45.904 "rw_mbytes_per_sec": 0, 00:09:45.904 "r_mbytes_per_sec": 0, 00:09:45.904 "w_mbytes_per_sec": 0 00:09:45.904 }, 00:09:45.904 "claimed": true, 00:09:45.904 "claim_type": "exclusive_write", 00:09:45.904 "zoned": false, 00:09:45.904 "supported_io_types": { 00:09:45.904 "read": true, 00:09:45.904 "write": true, 00:09:45.904 "unmap": true, 00:09:45.904 "flush": true, 00:09:45.904 "reset": true, 00:09:45.904 "nvme_admin": false, 00:09:45.904 "nvme_io": false, 00:09:45.904 "nvme_io_md": false, 00:09:45.904 
"write_zeroes": true, 00:09:45.904 "zcopy": true, 00:09:45.904 "get_zone_info": false, 00:09:45.904 "zone_management": false, 00:09:45.904 "zone_append": false, 00:09:45.904 "compare": false, 00:09:45.904 "compare_and_write": false, 00:09:45.904 "abort": true, 00:09:45.904 "seek_hole": false, 00:09:45.904 "seek_data": false, 00:09:45.904 "copy": true, 00:09:45.904 "nvme_iov_md": false 00:09:45.904 }, 00:09:45.904 "memory_domains": [ 00:09:45.904 { 00:09:45.904 "dma_device_id": "system", 00:09:45.904 "dma_device_type": 1 00:09:45.904 }, 00:09:45.904 { 00:09:45.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:45.904 "dma_device_type": 2 00:09:45.904 } 00:09:45.904 ], 00:09:45.904 "driver_specific": {} 00:09:45.904 }' 00:09:45.904 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:45.905 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:46.161 "name": "BaseBdev2", 00:09:46.161 "aliases": [ 00:09:46.161 "57f06d28-3e78-4e3b-8eb3-50ca67714c38" 00:09:46.161 ], 00:09:46.161 "product_name": "Malloc disk", 00:09:46.161 "block_size": 512, 00:09:46.161 "num_blocks": 65536, 00:09:46.161 "uuid": "57f06d28-3e78-4e3b-8eb3-50ca67714c38", 00:09:46.161 "assigned_rate_limits": { 00:09:46.161 "rw_ios_per_sec": 0, 00:09:46.161 "rw_mbytes_per_sec": 0, 00:09:46.161 "r_mbytes_per_sec": 0, 00:09:46.161 "w_mbytes_per_sec": 0 00:09:46.161 }, 00:09:46.161 "claimed": true, 00:09:46.161 "claim_type": "exclusive_write", 00:09:46.161 "zoned": false, 00:09:46.161 "supported_io_types": { 00:09:46.161 "read": true, 00:09:46.161 "write": true, 00:09:46.161 "unmap": true, 00:09:46.161 "flush": true, 00:09:46.161 "reset": true, 00:09:46.161 "nvme_admin": false, 00:09:46.161 "nvme_io": false, 00:09:46.161 "nvme_io_md": false, 00:09:46.161 "write_zeroes": true, 00:09:46.161 "zcopy": true, 00:09:46.161 "get_zone_info": false, 00:09:46.161 "zone_management": false, 
00:09:46.161 "zone_append": false, 00:09:46.161 "compare": false, 00:09:46.161 "compare_and_write": false, 00:09:46.161 "abort": true, 00:09:46.161 "seek_hole": false, 00:09:46.161 "seek_data": false, 00:09:46.161 "copy": true, 00:09:46.161 "nvme_iov_md": false 00:09:46.161 }, 00:09:46.161 "memory_domains": [ 00:09:46.161 { 00:09:46.161 "dma_device_id": "system", 00:09:46.161 "dma_device_type": 1 00:09:46.161 }, 00:09:46.161 { 00:09:46.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:46.161 "dma_device_type": 2 00:09:46.161 } 00:09:46.161 ], 00:09:46.161 "driver_specific": {} 00:09:46.161 }' 00:09:46.161 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:46.418 13:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:46.418 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:46.675 [2024-07-15 13:32:34.224635] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:46.675 [2024-07-15 13:32:34.224658] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:46.675 [2024-07-15 13:32:34.224689] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:46.675 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:46.934 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:46.934 "name": "Existed_Raid", 00:09:46.934 "uuid": "7d1b3b8f-f410-40d5-9329-5279c0799f86", 00:09:46.934 "strip_size_kb": 64, 00:09:46.934 "state": "offline", 00:09:46.934 "raid_level": "raid0", 00:09:46.934 "superblock": true, 00:09:46.934 "num_base_bdevs": 2, 00:09:46.934 "num_base_bdevs_discovered": 1, 00:09:46.934 "num_base_bdevs_operational": 1, 00:09:46.934 "base_bdevs_list": [ 00:09:46.934 { 00:09:46.934 "name": null, 00:09:46.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:46.934 "is_configured": false, 00:09:46.934 "data_offset": 2048, 00:09:46.934 "data_size": 63488 00:09:46.934 }, 00:09:46.934 { 00:09:46.934 "name": "BaseBdev2", 00:09:46.934 "uuid": "57f06d28-3e78-4e3b-8eb3-50ca67714c38", 00:09:46.934 "is_configured": true, 00:09:46.934 "data_offset": 2048, 00:09:46.934 "data_size": 63488 00:09:46.934 } 00:09:46.934 ] 00:09:46.934 }' 00:09:46.934 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:46.934 13:32:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:47.511 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:47.511 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:47.511 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:47.511 13:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:47.511 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:47.511 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:47.511 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:47.795 [2024-07-15 13:32:35.248010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:47.795 [2024-07-15 13:32:35.248053] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1033610 name Existed_Raid, state offline 00:09:47.795 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:47.795 13:32:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:47.795 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:47.795 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4165983 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4165983 ']' 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4165983 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4165983 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4165983' 00:09:48.093 killing process with pid 4165983 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4165983 00:09:48.093 [2024-07-15 13:32:35.486493] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4165983 00:09:48.093 [2024-07-15 13:32:35.487292] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:48.093 00:09:48.093 real 0m8.297s 00:09:48.093 user 0m14.570s 00:09:48.093 sys 0m1.689s 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:48.093 13:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:48.093 ************************************ 00:09:48.093 END TEST raid_state_function_test_sb 00:09:48.093 ************************************ 00:09:48.355 13:32:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:48.355 13:32:35 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:09:48.355 13:32:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:48.355 13:32:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:48.355 13:32:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:48.355 ************************************ 00:09:48.355 START TEST raid_superblock_test 00:09:48.355 ************************************ 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:09:48.355 13:32:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:09:48.355 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4167278 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4167278 /var/tmp/spdk-raid.sock 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4167278 ']' 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:48.356 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:48.356 13:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:48.356 [2024-07-15 13:32:35.809919] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:48.356 [2024-07-15 13:32:35.809979] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4167278 ] 00:09:48.356 [2024-07-15 13:32:35.897449] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:48.612 [2024-07-15 13:32:35.986025] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:48.612 [2024-07-15 13:32:36.047871] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:48.612 [2024-07-15 13:32:36.047904] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:49.177 malloc1 00:09:49.177 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:49.435 [2024-07-15 13:32:36.940632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:49.435 [2024-07-15 13:32:36.940675] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:49.435 [2024-07-15 13:32:36.940703] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25cf260 00:09:49.435 [2024-07-15 13:32:36.940711] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:49.435 [2024-07-15 13:32:36.941901] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:49.435 [2024-07-15 13:32:36.941925] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:49.435 pt1 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:49.435 13:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:49.694 malloc2 00:09:49.694 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:49.694 [2024-07-15 13:32:37.301384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:49.694 [2024-07-15 13:32:37.301424] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:49.694 [2024-07-15 13:32:37.301452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2779310 00:09:49.694 [2024-07-15 13:32:37.301461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:49.694 [2024-07-15 13:32:37.302411] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:49.694 [2024-07-15 13:32:37.302431] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:49.694 pt2 00:09:49.951 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:49.951 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:49.951 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:09:49.951 [2024-07-15 13:32:37.465821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:49.951 [2024-07-15 13:32:37.466622] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:49.951 [2024-07-15 13:32:37.466717] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27785b0 00:09:49.951 [2024-07-15 13:32:37.466726] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:49.951 [2024-07-15 13:32:37.466843] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277aa10 00:09:49.951 [2024-07-15 13:32:37.466936] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27785b0 00:09:49.952 [2024-07-15 13:32:37.466942] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27785b0 00:09:49.952 [2024-07-15 13:32:37.467010] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:49.952 13:32:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.952 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:50.209 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:50.209 "name": "raid_bdev1", 00:09:50.209 "uuid": "b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:50.209 "strip_size_kb": 64, 00:09:50.209 "state": "online", 00:09:50.209 "raid_level": "raid0", 00:09:50.209 "superblock": true, 00:09:50.209 "num_base_bdevs": 2, 00:09:50.209 "num_base_bdevs_discovered": 2, 00:09:50.209 "num_base_bdevs_operational": 2, 00:09:50.209 "base_bdevs_list": [ 00:09:50.209 { 00:09:50.209 "name": "pt1", 00:09:50.209 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:50.209 "is_configured": true, 00:09:50.209 "data_offset": 2048, 00:09:50.209 "data_size": 63488 00:09:50.209 }, 00:09:50.209 { 00:09:50.209 "name": "pt2", 00:09:50.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:50.209 "is_configured": true, 00:09:50.209 "data_offset": 2048, 00:09:50.209 "data_size": 63488 00:09:50.209 } 00:09:50.209 ] 00:09:50.209 }' 00:09:50.209 13:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:50.209 13:32:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:50.774 [2024-07-15 13:32:38.312255] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:50.774 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:50.774 "name": "raid_bdev1", 00:09:50.774 "aliases": [ 00:09:50.774 "b2d93ded-17f7-4cb3-aec0-016dfb60b521" 00:09:50.774 ], 00:09:50.774 "product_name": "Raid Volume", 00:09:50.774 "block_size": 512, 00:09:50.774 "num_blocks": 126976, 00:09:50.774 "uuid": 
"b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:50.774 "assigned_rate_limits": { 00:09:50.774 "rw_ios_per_sec": 0, 00:09:50.774 "rw_mbytes_per_sec": 0, 00:09:50.774 "r_mbytes_per_sec": 0, 00:09:50.774 "w_mbytes_per_sec": 0 00:09:50.774 }, 00:09:50.774 "claimed": false, 00:09:50.774 "zoned": false, 00:09:50.774 "supported_io_types": { 00:09:50.774 "read": true, 00:09:50.774 "write": true, 00:09:50.774 "unmap": true, 00:09:50.774 "flush": true, 00:09:50.774 "reset": true, 00:09:50.774 "nvme_admin": false, 00:09:50.774 "nvme_io": false, 00:09:50.774 "nvme_io_md": false, 00:09:50.774 "write_zeroes": true, 00:09:50.774 "zcopy": false, 00:09:50.774 "get_zone_info": false, 00:09:50.774 "zone_management": false, 00:09:50.774 "zone_append": false, 00:09:50.774 "compare": false, 00:09:50.774 "compare_and_write": false, 00:09:50.775 "abort": false, 00:09:50.775 "seek_hole": false, 00:09:50.775 "seek_data": false, 00:09:50.775 "copy": false, 00:09:50.775 "nvme_iov_md": false 00:09:50.775 }, 00:09:50.775 "memory_domains": [ 00:09:50.775 { 00:09:50.775 "dma_device_id": "system", 00:09:50.775 "dma_device_type": 1 00:09:50.775 }, 00:09:50.775 { 00:09:50.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.775 "dma_device_type": 2 00:09:50.775 }, 00:09:50.775 { 00:09:50.775 "dma_device_id": "system", 00:09:50.775 "dma_device_type": 1 00:09:50.775 }, 00:09:50.775 { 00:09:50.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.775 "dma_device_type": 2 00:09:50.775 } 00:09:50.775 ], 00:09:50.775 "driver_specific": { 00:09:50.775 "raid": { 00:09:50.775 "uuid": "b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:50.775 "strip_size_kb": 64, 00:09:50.775 "state": "online", 00:09:50.775 "raid_level": "raid0", 00:09:50.775 "superblock": true, 00:09:50.775 "num_base_bdevs": 2, 00:09:50.775 "num_base_bdevs_discovered": 2, 00:09:50.775 "num_base_bdevs_operational": 2, 00:09:50.775 "base_bdevs_list": [ 00:09:50.775 { 00:09:50.775 "name": "pt1", 00:09:50.775 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:50.775 "is_configured": true, 00:09:50.775 "data_offset": 2048, 00:09:50.775 "data_size": 63488 00:09:50.775 }, 00:09:50.775 { 00:09:50.775 "name": "pt2", 00:09:50.775 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:50.775 "is_configured": true, 00:09:50.775 "data_offset": 2048, 00:09:50.775 "data_size": 63488 00:09:50.775 } 00:09:50.775 ] 00:09:50.775 } 00:09:50.775 } 00:09:50.775 }' 00:09:50.775 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:50.775 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:50.775 pt2' 00:09:50.775 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:50.775 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:50.775 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:51.033 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:51.033 "name": "pt1", 00:09:51.033 "aliases": [ 00:09:51.033 "00000000-0000-0000-0000-000000000001" 00:09:51.033 ], 00:09:51.033 "product_name": "passthru", 00:09:51.033 "block_size": 512, 00:09:51.033 "num_blocks": 65536, 00:09:51.033 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:51.033 "assigned_rate_limits": { 00:09:51.033 
"rw_ios_per_sec": 0, 00:09:51.033 "rw_mbytes_per_sec": 0, 00:09:51.033 "r_mbytes_per_sec": 0, 00:09:51.033 "w_mbytes_per_sec": 0 00:09:51.033 }, 00:09:51.033 "claimed": true, 00:09:51.033 "claim_type": "exclusive_write", 00:09:51.033 "zoned": false, 00:09:51.033 "supported_io_types": { 00:09:51.033 "read": true, 00:09:51.033 "write": true, 00:09:51.033 "unmap": true, 00:09:51.033 "flush": true, 00:09:51.033 "reset": true, 00:09:51.033 "nvme_admin": false, 00:09:51.033 "nvme_io": false, 00:09:51.033 "nvme_io_md": false, 00:09:51.033 "write_zeroes": true, 00:09:51.033 "zcopy": true, 00:09:51.033 "get_zone_info": false, 00:09:51.033 "zone_management": false, 00:09:51.033 "zone_append": false, 00:09:51.033 "compare": false, 00:09:51.033 "compare_and_write": false, 00:09:51.033 "abort": true, 00:09:51.033 "seek_hole": false, 00:09:51.033 "seek_data": false, 00:09:51.033 "copy": true, 00:09:51.033 "nvme_iov_md": false 00:09:51.033 }, 00:09:51.033 "memory_domains": [ 00:09:51.033 { 00:09:51.033 "dma_device_id": "system", 00:09:51.033 "dma_device_type": 1 00:09:51.033 }, 00:09:51.033 { 00:09:51.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.033 "dma_device_type": 2 00:09:51.033 } 00:09:51.033 ], 00:09:51.033 "driver_specific": { 00:09:51.033 "passthru": { 00:09:51.033 "name": "pt1", 00:09:51.033 "base_bdev_name": "malloc1" 00:09:51.033 } 00:09:51.033 } 00:09:51.033 }' 00:09:51.033 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.033 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.033 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:51.033 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:51.291 13:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:51.548 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:51.548 "name": "pt2", 00:09:51.548 "aliases": [ 00:09:51.548 "00000000-0000-0000-0000-000000000002" 00:09:51.548 ], 00:09:51.548 "product_name": "passthru", 00:09:51.548 "block_size": 512, 00:09:51.548 "num_blocks": 65536, 00:09:51.548 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:51.548 "assigned_rate_limits": { 00:09:51.548 "rw_ios_per_sec": 0, 00:09:51.548 "rw_mbytes_per_sec": 0, 00:09:51.548 "r_mbytes_per_sec": 0, 00:09:51.548 "w_mbytes_per_sec": 0 
00:09:51.548 }, 00:09:51.548 "claimed": true, 00:09:51.548 "claim_type": "exclusive_write", 00:09:51.548 "zoned": false, 00:09:51.548 "supported_io_types": { 00:09:51.548 "read": true, 00:09:51.548 "write": true, 00:09:51.548 "unmap": true, 00:09:51.548 "flush": true, 00:09:51.548 "reset": true, 00:09:51.548 "nvme_admin": false, 00:09:51.548 "nvme_io": false, 00:09:51.548 "nvme_io_md": false, 00:09:51.548 "write_zeroes": true, 00:09:51.548 "zcopy": true, 00:09:51.548 "get_zone_info": false, 00:09:51.548 "zone_management": false, 00:09:51.548 "zone_append": false, 00:09:51.548 "compare": false, 00:09:51.548 "compare_and_write": false, 00:09:51.548 "abort": true, 00:09:51.548 "seek_hole": false, 00:09:51.548 "seek_data": false, 00:09:51.548 "copy": true, 00:09:51.548 "nvme_iov_md": false 00:09:51.548 }, 00:09:51.548 "memory_domains": [ 00:09:51.548 { 00:09:51.548 "dma_device_id": "system", 00:09:51.548 "dma_device_type": 1 00:09:51.548 }, 00:09:51.548 { 00:09:51.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.548 "dma_device_type": 2 00:09:51.548 } 00:09:51.548 ], 00:09:51.548 "driver_specific": { 00:09:51.548 "passthru": { 00:09:51.548 "name": "pt2", 00:09:51.548 "base_bdev_name": "malloc2" 00:09:51.548 } 00:09:51.548 } 00:09:51.548 }' 00:09:51.548 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.548 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.548 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:51.548 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.548 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:51.806 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:09:52.063 [2024-07-15 13:32:39.519527] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:52.063 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b2d93ded-17f7-4cb3-aec0-016dfb60b521 00:09:52.063 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b2d93ded-17f7-4cb3-aec0-016dfb60b521 ']' 00:09:52.063 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:52.320 [2024-07-15 13:32:39.695795] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:52.320 [2024-07-15 13:32:39.695826] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:09:52.320 [2024-07-15 13:32:39.695871] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:52.320 [2024-07-15 13:32:39.695904] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:52.320 [2024-07-15 13:32:39.695912] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27785b0 name raid_bdev1, state offline 00:09:52.320 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:52.320 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:09:52.320 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:09:52.320 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:09:52.320 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:52.320 13:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:52.577 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:52.577 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:52.835 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:53.094 [2024-07-15 13:32:40.582109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:53.094 [2024-07-15 13:32:40.583173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:53.094 [2024-07-15 13:32:40.583230] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:53.094 [2024-07-15 13:32:40.583263] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:53.094 [2024-07-15 13:32:40.583293] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:53.094 [2024-07-15 13:32:40.583300] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2779d40 name raid_bdev1, state configuring 00:09:53.094 request: 00:09:53.094 { 00:09:53.094 "name": "raid_bdev1", 00:09:53.094 "raid_level": "raid0", 00:09:53.094 "base_bdevs": [ 00:09:53.094 "malloc1", 00:09:53.094 "malloc2" 00:09:53.094 ], 00:09:53.094 "strip_size_kb": 64, 00:09:53.094 "superblock": false, 00:09:53.094 "method": "bdev_raid_create", 00:09:53.094 "req_id": 1 00:09:53.094 } 00:09:53.094 Got JSON-RPC error response 00:09:53.094 response: 00:09:53.094 { 00:09:53.094 "code": -17, 00:09:53.094 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:53.094 } 00:09:53.094 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:09:53.094 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:53.094 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:53.094 13:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:53.094 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:53.094 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:53.352 [2024-07-15 13:32:40.926919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:53.352 [2024-07-15 13:32:40.926964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:53.352 [2024-07-15 13:32:40.926979] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277c640 00:09:53.352 [2024-07-15 13:32:40.926986] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:53.352 [2024-07-15 13:32:40.928234] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:53.352 [2024-07-15 13:32:40.928258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:53.352 [2024-07-15 13:32:40.928317] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:53.352 [2024-07-15 13:32:40.928337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:53.352 pt1 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:53.352 13:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:53.609 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:53.609 "name": "raid_bdev1", 00:09:53.609 "uuid": "b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:53.609 "strip_size_kb": 64, 00:09:53.609 "state": "configuring", 00:09:53.609 "raid_level": "raid0", 00:09:53.609 "superblock": true, 00:09:53.609 "num_base_bdevs": 2, 00:09:53.609 "num_base_bdevs_discovered": 1, 00:09:53.609 "num_base_bdevs_operational": 2, 00:09:53.609 "base_bdevs_list": [ 00:09:53.609 { 00:09:53.609 "name": "pt1", 00:09:53.609 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:53.609 "is_configured": true, 00:09:53.609 "data_offset": 2048, 00:09:53.609 "data_size": 63488 00:09:53.609 }, 00:09:53.609 { 00:09:53.609 "name": null, 00:09:53.609 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:53.609 "is_configured": false, 00:09:53.609 "data_offset": 2048, 00:09:53.609 "data_size": 63488 00:09:53.609 } 00:09:53.609 ] 00:09:53.609 }' 00:09:53.609 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:53.609 13:32:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:54.173 [2024-07-15 13:32:41.753054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:54.173 [2024-07-15 13:32:41.753117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:54.173 [2024-07-15 13:32:41.753132] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25cf490 00:09:54.173 [2024-07-15 13:32:41.753140] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:54.173 [2024-07-15 13:32:41.753407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:54.173 [2024-07-15 13:32:41.753419] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:54.173 [2024-07-15 13:32:41.753468] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:54.173 [2024-07-15 13:32:41.753483] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:54.173 [2024-07-15 13:32:41.753555] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25ce420 00:09:54.173 [2024-07-15 13:32:41.753562] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:54.173 [2024-07-15 13:32:41.753683] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277c0a0 00:09:54.173 [2024-07-15 13:32:41.753767] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25ce420 00:09:54.173 [2024-07-15 13:32:41.753773] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25ce420 00:09:54.173 [2024-07-15 13:32:41.753844] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:54.173 pt2 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:54.173 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:54.174 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:54.174 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:54.174 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:54.174 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:54.431 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:09:54.431 "name": "raid_bdev1", 00:09:54.431 "uuid": "b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:54.431 "strip_size_kb": 64, 00:09:54.431 "state": "online", 00:09:54.431 "raid_level": "raid0", 00:09:54.431 "superblock": true, 00:09:54.431 "num_base_bdevs": 2, 00:09:54.431 "num_base_bdevs_discovered": 2, 00:09:54.431 "num_base_bdevs_operational": 2, 00:09:54.431 "base_bdevs_list": [ 00:09:54.431 { 00:09:54.431 "name": "pt1", 00:09:54.431 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:54.431 "is_configured": true, 00:09:54.431 "data_offset": 2048, 00:09:54.431 "data_size": 63488 00:09:54.431 }, 00:09:54.431 { 00:09:54.431 "name": "pt2", 00:09:54.431 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:54.431 "is_configured": true, 00:09:54.431 "data_offset": 2048, 00:09:54.431 "data_size": 63488 00:09:54.431 } 00:09:54.431 ] 00:09:54.431 }' 00:09:54.431 13:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:54.431 13:32:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:54.995 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:54.995 [2024-07-15 13:32:42.591416] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:55.252 "name": "raid_bdev1", 00:09:55.252 "aliases": [ 00:09:55.252 "b2d93ded-17f7-4cb3-aec0-016dfb60b521" 00:09:55.252 ], 00:09:55.252 "product_name": "Raid Volume", 00:09:55.252 "block_size": 512, 00:09:55.252 "num_blocks": 126976, 00:09:55.252 "uuid": "b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:55.252 "assigned_rate_limits": { 00:09:55.252 "rw_ios_per_sec": 0, 00:09:55.252 "rw_mbytes_per_sec": 0, 00:09:55.252 "r_mbytes_per_sec": 0, 00:09:55.252 "w_mbytes_per_sec": 0 00:09:55.252 }, 00:09:55.252 "claimed": false, 00:09:55.252 "zoned": false, 00:09:55.252 "supported_io_types": { 00:09:55.252 "read": true, 00:09:55.252 "write": true, 00:09:55.252 "unmap": true, 00:09:55.252 "flush": true, 00:09:55.252 "reset": true, 00:09:55.252 "nvme_admin": false, 00:09:55.252 "nvme_io": false, 00:09:55.252 "nvme_io_md": false, 00:09:55.252 "write_zeroes": true, 00:09:55.252 "zcopy": false, 00:09:55.252 "get_zone_info": false, 00:09:55.252 "zone_management": false, 00:09:55.252 "zone_append": false, 00:09:55.252 "compare": false, 00:09:55.252 "compare_and_write": false, 00:09:55.252 "abort": false, 00:09:55.252 "seek_hole": false, 00:09:55.252 "seek_data": false, 00:09:55.252 "copy": false, 00:09:55.252 "nvme_iov_md": false 00:09:55.252 }, 00:09:55.252 "memory_domains": [ 00:09:55.252 { 00:09:55.252 "dma_device_id": 
"system", 00:09:55.252 "dma_device_type": 1 00:09:55.252 }, 00:09:55.252 { 00:09:55.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.252 "dma_device_type": 2 00:09:55.252 }, 00:09:55.252 { 00:09:55.252 "dma_device_id": "system", 00:09:55.252 "dma_device_type": 1 00:09:55.252 }, 00:09:55.252 { 00:09:55.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.252 "dma_device_type": 2 00:09:55.252 } 00:09:55.252 ], 00:09:55.252 "driver_specific": { 00:09:55.252 "raid": { 00:09:55.252 "uuid": "b2d93ded-17f7-4cb3-aec0-016dfb60b521", 00:09:55.252 "strip_size_kb": 64, 00:09:55.252 "state": "online", 00:09:55.252 "raid_level": "raid0", 00:09:55.252 "superblock": true, 00:09:55.252 "num_base_bdevs": 2, 00:09:55.252 "num_base_bdevs_discovered": 2, 00:09:55.252 "num_base_bdevs_operational": 2, 00:09:55.252 "base_bdevs_list": [ 00:09:55.252 { 00:09:55.252 "name": "pt1", 00:09:55.252 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:55.252 "is_configured": true, 00:09:55.252 "data_offset": 2048, 00:09:55.252 "data_size": 63488 00:09:55.252 }, 00:09:55.252 { 00:09:55.252 "name": "pt2", 00:09:55.252 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:55.252 "is_configured": true, 00:09:55.252 "data_offset": 2048, 00:09:55.252 "data_size": 63488 00:09:55.252 } 00:09:55.252 ] 00:09:55.252 } 00:09:55.252 } 00:09:55.252 }' 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:55.252 pt2' 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:55.252 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:55.252 "name": "pt1", 00:09:55.252 "aliases": [ 00:09:55.252 "00000000-0000-0000-0000-000000000001" 00:09:55.252 ], 00:09:55.252 "product_name": "passthru", 00:09:55.252 "block_size": 512, 00:09:55.252 "num_blocks": 65536, 00:09:55.252 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:55.252 "assigned_rate_limits": { 00:09:55.252 "rw_ios_per_sec": 0, 00:09:55.252 "rw_mbytes_per_sec": 0, 00:09:55.252 "r_mbytes_per_sec": 0, 00:09:55.252 "w_mbytes_per_sec": 0 00:09:55.252 }, 00:09:55.252 "claimed": true, 00:09:55.252 "claim_type": "exclusive_write", 00:09:55.252 "zoned": false, 00:09:55.252 "supported_io_types": { 00:09:55.252 "read": true, 00:09:55.252 "write": true, 00:09:55.252 "unmap": true, 00:09:55.252 "flush": true, 00:09:55.252 "reset": true, 00:09:55.252 "nvme_admin": false, 00:09:55.252 "nvme_io": false, 00:09:55.252 "nvme_io_md": false, 00:09:55.252 "write_zeroes": true, 00:09:55.252 "zcopy": true, 00:09:55.252 "get_zone_info": false, 00:09:55.252 "zone_management": false, 00:09:55.252 "zone_append": false, 00:09:55.252 "compare": false, 00:09:55.252 "compare_and_write": false, 00:09:55.252 "abort": true, 00:09:55.253 "seek_hole": false, 00:09:55.253 "seek_data": false, 00:09:55.253 "copy": true, 00:09:55.253 "nvme_iov_md": false 00:09:55.253 }, 00:09:55.253 "memory_domains": [ 00:09:55.253 { 00:09:55.253 "dma_device_id": "system", 00:09:55.253 "dma_device_type": 1 00:09:55.253 }, 
00:09:55.253 { 00:09:55.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.253 "dma_device_type": 2 00:09:55.253 } 00:09:55.253 ], 00:09:55.253 "driver_specific": { 00:09:55.253 "passthru": { 00:09:55.253 "name": "pt1", 00:09:55.253 "base_bdev_name": "malloc1" 00:09:55.253 } 00:09:55.253 } 00:09:55.253 }' 00:09:55.253 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:55.509 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:55.509 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:55.509 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:55.509 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:55.509 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:55.509 13:32:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:55.509 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:55.509 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:55.509 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:55.509 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:55.765 "name": "pt2", 00:09:55.765 "aliases": [ 00:09:55.765 "00000000-0000-0000-0000-000000000002" 00:09:55.765 ], 00:09:55.765 "product_name": "passthru", 00:09:55.765 "block_size": 512, 00:09:55.765 "num_blocks": 65536, 00:09:55.765 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:55.765 "assigned_rate_limits": { 00:09:55.765 "rw_ios_per_sec": 0, 00:09:55.765 "rw_mbytes_per_sec": 0, 00:09:55.765 "r_mbytes_per_sec": 0, 00:09:55.765 "w_mbytes_per_sec": 0 00:09:55.765 }, 00:09:55.765 "claimed": true, 00:09:55.765 "claim_type": "exclusive_write", 00:09:55.765 "zoned": false, 00:09:55.765 "supported_io_types": { 00:09:55.765 "read": true, 00:09:55.765 "write": true, 00:09:55.765 "unmap": true, 00:09:55.765 "flush": true, 00:09:55.765 "reset": true, 00:09:55.765 "nvme_admin": false, 00:09:55.765 "nvme_io": false, 00:09:55.765 "nvme_io_md": false, 00:09:55.765 "write_zeroes": true, 00:09:55.765 "zcopy": true, 00:09:55.765 "get_zone_info": false, 00:09:55.765 "zone_management": false, 00:09:55.765 "zone_append": false, 00:09:55.765 "compare": false, 00:09:55.765 "compare_and_write": false, 00:09:55.765 "abort": true, 00:09:55.765 "seek_hole": false, 00:09:55.765 "seek_data": false, 00:09:55.765 "copy": true, 00:09:55.765 "nvme_iov_md": false 00:09:55.765 }, 00:09:55.765 "memory_domains": [ 00:09:55.765 { 00:09:55.765 "dma_device_id": "system", 00:09:55.765 "dma_device_type": 1 00:09:55.765 }, 00:09:55.765 { 00:09:55.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.765 "dma_device_type": 2 00:09:55.765 } 00:09:55.765 ], 
00:09:55.765 "driver_specific": { 00:09:55.765 "passthru": { 00:09:55.765 "name": "pt2", 00:09:55.765 "base_bdev_name": "malloc2" 00:09:55.765 } 00:09:55.765 } 00:09:55.765 }' 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:55.765 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:56.022 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:09:56.279 [2024-07-15 13:32:43.766466] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b2d93ded-17f7-4cb3-aec0-016dfb60b521 '!=' b2d93ded-17f7-4cb3-aec0-016dfb60b521 ']' 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4167278 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4167278 ']' 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4167278 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4167278 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4167278' 00:09:56.279 killing process with pid 4167278 00:09:56.279 13:32:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4167278 00:09:56.279 [2024-07-15 13:32:43.830991] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:56.279 13:32:43 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4167278 00:09:56.279 [2024-07-15 13:32:43.831064] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:56.279 [2024-07-15 13:32:43.831098] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:56.279 [2024-07-15 13:32:43.831107] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25ce420 name raid_bdev1, state offline 00:09:56.279 [2024-07-15 13:32:43.849052] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:56.537 13:32:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:09:56.537 00:09:56.537 real 0m8.304s 00:09:56.537 user 0m14.665s 00:09:56.537 sys 0m1.596s 00:09:56.537 13:32:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:56.537 13:32:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:56.537 ************************************ 00:09:56.537 END TEST raid_superblock_test 00:09:56.537 ************************************ 00:09:56.537 13:32:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:56.537 13:32:44 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:09:56.537 13:32:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:56.537 13:32:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:56.537 13:32:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:56.537 ************************************ 00:09:56.537 START TEST raid_read_error_test 00:09:56.537 ************************************ 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:56.537 13:32:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.47HXr0SLvC 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4168561 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4168561 /var/tmp/spdk-raid.sock 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4168561 ']' 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:56.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:56.537 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:56.795 [2024-07-15 13:32:44.176712] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:09:56.795 [2024-07-15 13:32:44.176765] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4168561 ] 00:09:56.795 [2024-07-15 13:32:44.264708] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:56.795 [2024-07-15 13:32:44.357736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:57.052 [2024-07-15 13:32:44.422062] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:57.052 [2024-07-15 13:32:44.422091] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:57.614 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:57.614 13:32:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:09:57.614 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:57.614 13:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:57.614 BaseBdev1_malloc 00:09:57.614 13:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:57.871 true 00:09:57.871 13:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:57.871 [2024-07-15 13:32:45.472803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:57.871 [2024-07-15 13:32:45.472837] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:57.871 [2024-07-15 13:32:45.472867] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e2990 00:09:57.871 [2024-07-15 13:32:45.472876] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:57.871 [2024-07-15 13:32:45.474243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:57.871 [2024-07-15 13:32:45.474267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:57.871 BaseBdev1 00:09:57.871 13:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:57.871 13:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:58.128 BaseBdev2_malloc 00:09:58.128 13:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:58.385 true 00:09:58.385 13:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:58.641 [2024-07-15 13:32:46.007173] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:58.641 [2024-07-15 13:32:46.007211] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:58.641 [2024-07-15 13:32:46.007227] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e71d0 00:09:58.641 [2024-07-15 13:32:46.007236] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:58.641 [2024-07-15 13:32:46.008327] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:58.641 [2024-07-15 13:32:46.008348] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:58.641 BaseBdev2 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:58.641 [2024-07-15 13:32:46.191673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:58.641 [2024-07-15 13:32:46.192731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:58.641 [2024-07-15 13:32:46.192870] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25e8be0 00:09:58.641 [2024-07-15 13:32:46.192879] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:58.641 [2024-07-15 13:32:46.193032] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e7b30 00:09:58.641 [2024-07-15 13:32:46.193139] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25e8be0 00:09:58.641 [2024-07-15 13:32:46.193146] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25e8be0 00:09:58.641 [2024-07-15 13:32:46.193222] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:58.641 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:58.897 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:58.897 "name": "raid_bdev1", 00:09:58.897 "uuid": "7d7dea98-c31c-4454-9b45-5fe6cf3faf87", 00:09:58.897 "strip_size_kb": 64, 00:09:58.897 "state": "online", 00:09:58.897 "raid_level": "raid0", 
00:09:58.897 "superblock": true, 00:09:58.897 "num_base_bdevs": 2, 00:09:58.897 "num_base_bdevs_discovered": 2, 00:09:58.897 "num_base_bdevs_operational": 2, 00:09:58.897 "base_bdevs_list": [ 00:09:58.897 { 00:09:58.897 "name": "BaseBdev1", 00:09:58.897 "uuid": "19b472ff-8373-543a-991b-628362313f2a", 00:09:58.897 "is_configured": true, 00:09:58.897 "data_offset": 2048, 00:09:58.897 "data_size": 63488 00:09:58.897 }, 00:09:58.897 { 00:09:58.897 "name": "BaseBdev2", 00:09:58.897 "uuid": "3e518791-3375-5be1-828b-880bdb61b954", 00:09:58.897 "is_configured": true, 00:09:58.897 "data_offset": 2048, 00:09:58.897 "data_size": 63488 00:09:58.897 } 00:09:58.897 ] 00:09:58.897 }' 00:09:58.897 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:58.897 13:32:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:59.460 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:59.460 13:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:59.460 [2024-07-15 13:32:46.953844] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e4270 00:10:00.390 13:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:00.647 "name": "raid_bdev1", 00:10:00.647 "uuid": "7d7dea98-c31c-4454-9b45-5fe6cf3faf87", 00:10:00.647 "strip_size_kb": 64, 00:10:00.647 "state": "online", 00:10:00.647 
"raid_level": "raid0", 00:10:00.647 "superblock": true, 00:10:00.647 "num_base_bdevs": 2, 00:10:00.647 "num_base_bdevs_discovered": 2, 00:10:00.647 "num_base_bdevs_operational": 2, 00:10:00.647 "base_bdevs_list": [ 00:10:00.647 { 00:10:00.647 "name": "BaseBdev1", 00:10:00.647 "uuid": "19b472ff-8373-543a-991b-628362313f2a", 00:10:00.647 "is_configured": true, 00:10:00.647 "data_offset": 2048, 00:10:00.647 "data_size": 63488 00:10:00.647 }, 00:10:00.647 { 00:10:00.647 "name": "BaseBdev2", 00:10:00.647 "uuid": "3e518791-3375-5be1-828b-880bdb61b954", 00:10:00.647 "is_configured": true, 00:10:00.647 "data_offset": 2048, 00:10:00.647 "data_size": 63488 00:10:00.647 } 00:10:00.647 ] 00:10:00.647 }' 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:00.647 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.208 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:01.465 [2024-07-15 13:32:48.902969] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:01.465 [2024-07-15 13:32:48.903007] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:01.465 [2024-07-15 13:32:48.905104] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:01.465 [2024-07-15 13:32:48.905125] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:01.465 [2024-07-15 13:32:48.905150] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:01.465 [2024-07-15 13:32:48.905157] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e8be0 name raid_bdev1, state offline 00:10:01.465 0 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4168561 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4168561 ']' 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4168561 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4168561 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4168561' 00:10:01.465 killing process with pid 4168561 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4168561 00:10:01.465 [2024-07-15 13:32:48.969775] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:01.465 13:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4168561 00:10:01.465 [2024-07-15 13:32:48.979915] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.47HXr0SLvC 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:10:01.723 00:10:01.723 real 0m5.067s 00:10:01.723 user 0m7.619s 00:10:01.723 sys 0m0.909s 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:01.723 13:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.723 ************************************ 00:10:01.723 END TEST raid_read_error_test 00:10:01.723 ************************************ 00:10:01.723 13:32:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:01.723 13:32:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:10:01.723 13:32:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:01.723 13:32:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:01.723 13:32:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:01.723 ************************************ 00:10:01.723 START TEST raid_write_error_test 00:10:01.723 ************************************ 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HWPF3tI4aC 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4169362 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4169362 /var/tmp/spdk-raid.sock 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4169362 ']' 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:01.723 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:01.723 13:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.723 [2024-07-15 13:32:49.321528] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:10:01.723 [2024-07-15 13:32:49.321575] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4169362 ] 00:10:01.980 [2024-07-15 13:32:49.405920] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:01.980 [2024-07-15 13:32:49.496605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:01.980 [2024-07-15 13:32:49.557119] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:01.981 [2024-07-15 13:32:49.557144] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:02.544 13:32:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:02.544 13:32:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:02.544 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:02.544 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:02.801 BaseBdev1_malloc 00:10:02.801 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:03.058 true 00:10:03.058 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:03.058 [2024-07-15 13:32:50.650399] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:03.058 [2024-07-15 13:32:50.650437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:03.058 [2024-07-15 13:32:50.650453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x291c990 00:10:03.058 [2024-07-15 13:32:50.650462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:03.058 [2024-07-15 13:32:50.651898] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:03.058 [2024-07-15 13:32:50.651921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:03.058 BaseBdev1 00:10:03.058 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:03.058 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:03.325 BaseBdev2_malloc 00:10:03.325 13:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:03.581 true 00:10:03.581 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:03.581 [2024-07-15 13:32:51.164909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:03.581 [2024-07-15 13:32:51.164945] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:03.581 [2024-07-15 13:32:51.164961] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29211d0 00:10:03.581 [2024-07-15 13:32:51.164986] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:03.581 [2024-07-15 13:32:51.166187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:03.581 [2024-07-15 13:32:51.166212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:03.581 BaseBdev2 00:10:03.581 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:03.837 [2024-07-15 13:32:51.337389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:03.837 [2024-07-15 13:32:51.338403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:03.837 [2024-07-15 13:32:51.338544] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2922be0 00:10:03.837 [2024-07-15 13:32:51.338554] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:03.838 [2024-07-15 13:32:51.338697] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2921b30 00:10:03.838 [2024-07-15 13:32:51.338805] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2922be0 00:10:03.838 [2024-07-15 13:32:51.338811] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2922be0 00:10:03.838 [2024-07-15 13:32:51.338889] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:03.838 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:04.095 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:04.095 "name": "raid_bdev1", 00:10:04.095 "uuid": "d4c48b6c-4dd8-4dad-ba12-733e5196006c", 00:10:04.095 "strip_size_kb": 64, 00:10:04.095 "state": "online", 00:10:04.095 
"raid_level": "raid0", 00:10:04.095 "superblock": true, 00:10:04.095 "num_base_bdevs": 2, 00:10:04.095 "num_base_bdevs_discovered": 2, 00:10:04.095 "num_base_bdevs_operational": 2, 00:10:04.095 "base_bdevs_list": [ 00:10:04.095 { 00:10:04.095 "name": "BaseBdev1", 00:10:04.095 "uuid": "36b5b091-6756-508a-8ddc-6d3547cc3bdf", 00:10:04.095 "is_configured": true, 00:10:04.095 "data_offset": 2048, 00:10:04.095 "data_size": 63488 00:10:04.095 }, 00:10:04.095 { 00:10:04.095 "name": "BaseBdev2", 00:10:04.095 "uuid": "aa0f50eb-bd3d-54c5-aadf-b81effd28a3a", 00:10:04.095 "is_configured": true, 00:10:04.095 "data_offset": 2048, 00:10:04.095 "data_size": 63488 00:10:04.095 } 00:10:04.095 ] 00:10:04.095 }' 00:10:04.095 13:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:04.095 13:32:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:04.660 13:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:04.661 13:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:04.661 [2024-07-15 13:32:52.115630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x291e270 00:10:05.592 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:05.861 "name": "raid_bdev1", 00:10:05.861 "uuid": "d4c48b6c-4dd8-4dad-ba12-733e5196006c", 00:10:05.861 "strip_size_kb": 64, 
00:10:05.861 "state": "online", 00:10:05.861 "raid_level": "raid0", 00:10:05.861 "superblock": true, 00:10:05.861 "num_base_bdevs": 2, 00:10:05.861 "num_base_bdevs_discovered": 2, 00:10:05.861 "num_base_bdevs_operational": 2, 00:10:05.861 "base_bdevs_list": [ 00:10:05.861 { 00:10:05.861 "name": "BaseBdev1", 00:10:05.861 "uuid": "36b5b091-6756-508a-8ddc-6d3547cc3bdf", 00:10:05.861 "is_configured": true, 00:10:05.861 "data_offset": 2048, 00:10:05.861 "data_size": 63488 00:10:05.861 }, 00:10:05.861 { 00:10:05.861 "name": "BaseBdev2", 00:10:05.861 "uuid": "aa0f50eb-bd3d-54c5-aadf-b81effd28a3a", 00:10:05.861 "is_configured": true, 00:10:05.861 "data_offset": 2048, 00:10:05.861 "data_size": 63488 00:10:05.861 } 00:10:05.861 ] 00:10:05.861 }' 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:05.861 13:32:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.429 13:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:06.687 [2024-07-15 13:32:54.048502] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:06.687 [2024-07-15 13:32:54.048527] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:06.687 [2024-07-15 13:32:54.050880] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:06.687 [2024-07-15 13:32:54.050906] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:06.687 [2024-07-15 13:32:54.050926] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:06.687 [2024-07-15 13:32:54.050934] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2922be0 name raid_bdev1, state offline 00:10:06.687 0 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4169362 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4169362 ']' 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4169362 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4169362 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4169362' 00:10:06.687 killing process with pid 4169362 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4169362 00:10:06.687 [2024-07-15 13:32:54.115006] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:06.687 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4169362 00:10:06.687 [2024-07-15 13:32:54.125273] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:06.945 13:32:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HWPF3tI4aC 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:10:06.945 00:10:06.945 real 0m5.058s 00:10:06.945 user 0m7.616s 00:10:06.945 sys 0m0.888s 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.945 13:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.945 ************************************ 00:10:06.945 END TEST raid_write_error_test 00:10:06.945 ************************************ 00:10:06.945 13:32:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:06.945 13:32:54 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:06.945 13:32:54 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:10:06.945 13:32:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:06.945 13:32:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:06.945 13:32:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:06.945 ************************************ 00:10:06.945 START TEST raid_state_function_test 00:10:06.945 ************************************ 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:06.945 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4170166 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4170166' 00:10:06.946 Process raid pid: 4170166 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4170166 /var/tmp/spdk-raid.sock 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4170166 ']' 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:06.946 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:06.946 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.946 [2024-07-15 13:32:54.444072] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:10:06.946 [2024-07-15 13:32:54.444118] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:06.946 [2024-07-15 13:32:54.532338] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.204 [2024-07-15 13:32:54.625112] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.204 [2024-07-15 13:32:54.684676] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:07.204 [2024-07-15 13:32:54.684701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:07.769 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:07.769 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:07.769 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:08.026 [2024-07-15 13:32:55.388257] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:08.026 [2024-07-15 13:32:55.388299] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:08.026 [2024-07-15 13:32:55.388318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:08.026 [2024-07-15 13:32:55.388326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:08.026 "name": "Existed_Raid", 00:10:08.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:08.026 "strip_size_kb": 64, 00:10:08.026 "state": "configuring", 00:10:08.026 "raid_level": "concat", 00:10:08.026 "superblock": false, 
00:10:08.026 "num_base_bdevs": 2, 00:10:08.026 "num_base_bdevs_discovered": 0, 00:10:08.026 "num_base_bdevs_operational": 2, 00:10:08.026 "base_bdevs_list": [ 00:10:08.026 { 00:10:08.026 "name": "BaseBdev1", 00:10:08.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:08.026 "is_configured": false, 00:10:08.026 "data_offset": 0, 00:10:08.026 "data_size": 0 00:10:08.026 }, 00:10:08.026 { 00:10:08.026 "name": "BaseBdev2", 00:10:08.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:08.026 "is_configured": false, 00:10:08.026 "data_offset": 0, 00:10:08.026 "data_size": 0 00:10:08.026 } 00:10:08.026 ] 00:10:08.026 }' 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:08.026 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:08.591 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:08.591 [2024-07-15 13:32:56.186232] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:08.591 [2024-07-15 13:32:56.186262] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa25f30 name Existed_Raid, state configuring 00:10:08.591 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:08.848 [2024-07-15 13:32:56.358686] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:08.848 [2024-07-15 13:32:56.358713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:08.848 [2024-07-15 13:32:56.358719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:08.848 [2024-07-15 13:32:56.358727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:08.848 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:09.124 [2024-07-15 13:32:56.543904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:09.124 BaseBdev1 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:09.124 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:10:09.419 [ 00:10:09.419 { 00:10:09.419 "name": "BaseBdev1", 00:10:09.419 "aliases": [ 00:10:09.419 "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d" 00:10:09.419 ], 00:10:09.419 "product_name": "Malloc disk", 00:10:09.419 "block_size": 512, 00:10:09.419 "num_blocks": 65536, 00:10:09.419 "uuid": "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d", 00:10:09.419 "assigned_rate_limits": { 00:10:09.419 "rw_ios_per_sec": 0, 00:10:09.419 "rw_mbytes_per_sec": 0, 00:10:09.419 "r_mbytes_per_sec": 0, 00:10:09.419 "w_mbytes_per_sec": 0 00:10:09.419 }, 00:10:09.419 "claimed": true, 00:10:09.419 "claim_type": "exclusive_write", 00:10:09.419 "zoned": false, 00:10:09.419 "supported_io_types": { 00:10:09.419 "read": true, 00:10:09.419 "write": true, 00:10:09.419 "unmap": true, 00:10:09.419 "flush": true, 00:10:09.419 "reset": true, 00:10:09.419 "nvme_admin": false, 00:10:09.419 "nvme_io": false, 00:10:09.419 "nvme_io_md": false, 00:10:09.419 "write_zeroes": true, 00:10:09.419 "zcopy": true, 00:10:09.419 "get_zone_info": false, 00:10:09.419 "zone_management": false, 00:10:09.419 "zone_append": false, 00:10:09.419 "compare": false, 00:10:09.419 "compare_and_write": false, 00:10:09.419 "abort": true, 00:10:09.419 "seek_hole": false, 00:10:09.419 "seek_data": false, 00:10:09.419 "copy": true, 00:10:09.419 "nvme_iov_md": false 00:10:09.419 }, 00:10:09.419 "memory_domains": [ 00:10:09.419 { 00:10:09.419 "dma_device_id": "system", 00:10:09.419 "dma_device_type": 1 00:10:09.419 }, 00:10:09.419 { 00:10:09.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:09.419 "dma_device_type": 2 00:10:09.419 } 00:10:09.419 ], 00:10:09.419 "driver_specific": {} 00:10:09.419 } 00:10:09.419 ] 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.419 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:09.684 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:09.684 "name": "Existed_Raid", 00:10:09.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.684 "strip_size_kb": 64, 00:10:09.684 "state": "configuring", 00:10:09.684 
"raid_level": "concat", 00:10:09.684 "superblock": false, 00:10:09.684 "num_base_bdevs": 2, 00:10:09.684 "num_base_bdevs_discovered": 1, 00:10:09.684 "num_base_bdevs_operational": 2, 00:10:09.684 "base_bdevs_list": [ 00:10:09.684 { 00:10:09.684 "name": "BaseBdev1", 00:10:09.684 "uuid": "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d", 00:10:09.684 "is_configured": true, 00:10:09.684 "data_offset": 0, 00:10:09.684 "data_size": 65536 00:10:09.684 }, 00:10:09.684 { 00:10:09.684 "name": "BaseBdev2", 00:10:09.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.684 "is_configured": false, 00:10:09.684 "data_offset": 0, 00:10:09.684 "data_size": 0 00:10:09.684 } 00:10:09.684 ] 00:10:09.684 }' 00:10:09.684 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:09.684 13:32:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:09.943 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:10.201 [2024-07-15 13:32:57.706905] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:10.201 [2024-07-15 13:32:57.706937] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa25820 name Existed_Raid, state configuring 00:10:10.201 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:10.460 [2024-07-15 13:32:57.875372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:10.460 [2024-07-15 13:32:57.876443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:10.460 [2024-07-15 13:32:57.876469] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:10.460 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:10.460 13:32:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:10.719 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:10.719 "name": "Existed_Raid", 00:10:10.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:10.719 "strip_size_kb": 64, 00:10:10.719 "state": "configuring", 00:10:10.719 "raid_level": "concat", 00:10:10.719 "superblock": false, 00:10:10.719 "num_base_bdevs": 2, 00:10:10.719 "num_base_bdevs_discovered": 1, 00:10:10.719 "num_base_bdevs_operational": 2, 00:10:10.719 "base_bdevs_list": [ 00:10:10.719 { 00:10:10.719 "name": "BaseBdev1", 00:10:10.719 "uuid": "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d", 00:10:10.719 "is_configured": true, 00:10:10.719 "data_offset": 0, 00:10:10.719 "data_size": 65536 00:10:10.719 }, 00:10:10.719 { 00:10:10.719 "name": "BaseBdev2", 00:10:10.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:10.719 "is_configured": false, 00:10:10.719 "data_offset": 0, 00:10:10.719 "data_size": 0 00:10:10.719 } 00:10:10.719 ] 00:10:10.719 }' 00:10:10.719 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:10.719 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:10.978 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:11.237 [2024-07-15 13:32:58.744364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:11.237 [2024-07-15 13:32:58.744396] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa26610 00:10:11.237 [2024-07-15 13:32:58.744401] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:11.237 [2024-07-15 13:32:58.744524] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbc9fb0 00:10:11.237 [2024-07-15 13:32:58.744605] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa26610 00:10:11.237 [2024-07-15 13:32:58.744611] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa26610 00:10:11.237 [2024-07-15 13:32:58.744733] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:11.237 BaseBdev2 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:11.237 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:11.497 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:11.497 [ 00:10:11.497 { 00:10:11.497 "name": "BaseBdev2", 00:10:11.497 "aliases": [ 00:10:11.497 "4845e09d-2794-4e3c-b58e-163f187750cd" 00:10:11.497 ], 00:10:11.497 "product_name": "Malloc disk", 00:10:11.497 "block_size": 512, 00:10:11.497 "num_blocks": 65536, 00:10:11.497 "uuid": "4845e09d-2794-4e3c-b58e-163f187750cd", 00:10:11.497 "assigned_rate_limits": { 00:10:11.497 "rw_ios_per_sec": 0, 00:10:11.497 "rw_mbytes_per_sec": 0, 00:10:11.497 "r_mbytes_per_sec": 0, 00:10:11.497 "w_mbytes_per_sec": 0 00:10:11.497 }, 00:10:11.497 "claimed": true, 00:10:11.497 "claim_type": "exclusive_write", 00:10:11.497 "zoned": false, 00:10:11.497 "supported_io_types": { 00:10:11.497 "read": true, 00:10:11.497 "write": true, 00:10:11.497 "unmap": true, 00:10:11.497 "flush": true, 00:10:11.497 "reset": true, 00:10:11.497 "nvme_admin": false, 00:10:11.497 "nvme_io": false, 00:10:11.497 "nvme_io_md": false, 00:10:11.497 "write_zeroes": true, 00:10:11.497 "zcopy": true, 00:10:11.497 "get_zone_info": false, 00:10:11.497 "zone_management": false, 00:10:11.497 "zone_append": false, 00:10:11.497 "compare": false, 00:10:11.497 "compare_and_write": false, 00:10:11.497 "abort": true, 00:10:11.497 "seek_hole": false, 00:10:11.497 "seek_data": false, 00:10:11.497 "copy": true, 00:10:11.497 "nvme_iov_md": false 00:10:11.497 }, 00:10:11.497 "memory_domains": [ 00:10:11.497 { 00:10:11.497 "dma_device_id": "system", 00:10:11.497 "dma_device_type": 1 00:10:11.497 }, 00:10:11.497 { 00:10:11.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:11.497 "dma_device_type": 2 00:10:11.497 } 00:10:11.497 ], 00:10:11.497 "driver_specific": {} 00:10:11.497 } 00:10:11.497 ] 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:11.497 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:11.757 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:11.757 13:32:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:11.757 "name": "Existed_Raid", 00:10:11.757 "uuid": "13d93776-bdc4-4c7d-9d2b-7bbc8aad9a08", 00:10:11.757 "strip_size_kb": 64, 00:10:11.757 "state": "online", 00:10:11.757 "raid_level": "concat", 00:10:11.757 "superblock": false, 00:10:11.757 "num_base_bdevs": 2, 00:10:11.757 "num_base_bdevs_discovered": 2, 00:10:11.757 "num_base_bdevs_operational": 2, 00:10:11.757 "base_bdevs_list": [ 00:10:11.757 { 00:10:11.757 "name": "BaseBdev1", 00:10:11.757 "uuid": "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d", 00:10:11.757 "is_configured": true, 00:10:11.757 "data_offset": 0, 00:10:11.757 "data_size": 65536 00:10:11.757 }, 00:10:11.757 { 00:10:11.757 "name": "BaseBdev2", 00:10:11.757 "uuid": "4845e09d-2794-4e3c-b58e-163f187750cd", 00:10:11.757 "is_configured": true, 00:10:11.757 "data_offset": 0, 00:10:11.757 "data_size": 65536 00:10:11.757 } 00:10:11.757 ] 00:10:11.757 }' 00:10:11.757 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:11.757 13:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:12.325 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:12.585 [2024-07-15 13:32:59.951659] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:12.585 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:12.585 "name": "Existed_Raid", 00:10:12.585 "aliases": [ 00:10:12.585 "13d93776-bdc4-4c7d-9d2b-7bbc8aad9a08" 00:10:12.585 ], 00:10:12.585 "product_name": "Raid Volume", 00:10:12.585 "block_size": 512, 00:10:12.585 "num_blocks": 131072, 00:10:12.585 "uuid": "13d93776-bdc4-4c7d-9d2b-7bbc8aad9a08", 00:10:12.585 "assigned_rate_limits": { 00:10:12.585 "rw_ios_per_sec": 0, 00:10:12.585 "rw_mbytes_per_sec": 0, 00:10:12.585 "r_mbytes_per_sec": 0, 00:10:12.585 "w_mbytes_per_sec": 0 00:10:12.585 }, 00:10:12.585 "claimed": false, 00:10:12.585 "zoned": false, 00:10:12.585 "supported_io_types": { 00:10:12.585 "read": true, 00:10:12.585 "write": true, 00:10:12.585 "unmap": true, 00:10:12.585 "flush": true, 00:10:12.585 "reset": true, 00:10:12.585 "nvme_admin": false, 00:10:12.585 "nvme_io": false, 00:10:12.585 "nvme_io_md": false, 00:10:12.585 "write_zeroes": true, 00:10:12.585 "zcopy": false, 00:10:12.585 "get_zone_info": false, 00:10:12.585 "zone_management": false, 00:10:12.585 "zone_append": false, 00:10:12.585 "compare": false, 00:10:12.585 "compare_and_write": false, 00:10:12.585 "abort": false, 00:10:12.585 "seek_hole": false, 00:10:12.585 "seek_data": false, 00:10:12.585 "copy": false, 00:10:12.585 
"nvme_iov_md": false 00:10:12.585 }, 00:10:12.585 "memory_domains": [ 00:10:12.585 { 00:10:12.585 "dma_device_id": "system", 00:10:12.585 "dma_device_type": 1 00:10:12.585 }, 00:10:12.585 { 00:10:12.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:12.585 "dma_device_type": 2 00:10:12.585 }, 00:10:12.585 { 00:10:12.585 "dma_device_id": "system", 00:10:12.585 "dma_device_type": 1 00:10:12.585 }, 00:10:12.585 { 00:10:12.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:12.585 "dma_device_type": 2 00:10:12.585 } 00:10:12.585 ], 00:10:12.585 "driver_specific": { 00:10:12.585 "raid": { 00:10:12.585 "uuid": "13d93776-bdc4-4c7d-9d2b-7bbc8aad9a08", 00:10:12.585 "strip_size_kb": 64, 00:10:12.585 "state": "online", 00:10:12.585 "raid_level": "concat", 00:10:12.585 "superblock": false, 00:10:12.585 "num_base_bdevs": 2, 00:10:12.585 "num_base_bdevs_discovered": 2, 00:10:12.585 "num_base_bdevs_operational": 2, 00:10:12.585 "base_bdevs_list": [ 00:10:12.585 { 00:10:12.585 "name": "BaseBdev1", 00:10:12.585 "uuid": "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d", 00:10:12.585 "is_configured": true, 00:10:12.585 "data_offset": 0, 00:10:12.585 "data_size": 65536 00:10:12.585 }, 00:10:12.585 { 00:10:12.585 "name": "BaseBdev2", 00:10:12.585 "uuid": "4845e09d-2794-4e3c-b58e-163f187750cd", 00:10:12.585 "is_configured": true, 00:10:12.585 "data_offset": 0, 00:10:12.585 "data_size": 65536 00:10:12.585 } 00:10:12.585 ] 00:10:12.585 } 00:10:12.585 } 00:10:12.585 }' 00:10:12.585 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:12.585 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:12.585 BaseBdev2' 00:10:12.585 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:12.585 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:12.585 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:12.585 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:12.585 "name": "BaseBdev1", 00:10:12.585 "aliases": [ 00:10:12.585 "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d" 00:10:12.585 ], 00:10:12.585 "product_name": "Malloc disk", 00:10:12.585 "block_size": 512, 00:10:12.585 "num_blocks": 65536, 00:10:12.585 "uuid": "1ac47bf5-d13a-420c-9fc7-6f8b0c82258d", 00:10:12.585 "assigned_rate_limits": { 00:10:12.585 "rw_ios_per_sec": 0, 00:10:12.585 "rw_mbytes_per_sec": 0, 00:10:12.585 "r_mbytes_per_sec": 0, 00:10:12.585 "w_mbytes_per_sec": 0 00:10:12.585 }, 00:10:12.585 "claimed": true, 00:10:12.585 "claim_type": "exclusive_write", 00:10:12.585 "zoned": false, 00:10:12.585 "supported_io_types": { 00:10:12.585 "read": true, 00:10:12.585 "write": true, 00:10:12.585 "unmap": true, 00:10:12.585 "flush": true, 00:10:12.585 "reset": true, 00:10:12.585 "nvme_admin": false, 00:10:12.585 "nvme_io": false, 00:10:12.585 "nvme_io_md": false, 00:10:12.585 "write_zeroes": true, 00:10:12.585 "zcopy": true, 00:10:12.585 "get_zone_info": false, 00:10:12.585 "zone_management": false, 00:10:12.585 "zone_append": false, 00:10:12.585 "compare": false, 00:10:12.585 "compare_and_write": false, 00:10:12.585 "abort": true, 00:10:12.585 "seek_hole": false, 00:10:12.585 "seek_data": false, 00:10:12.585 "copy": true, 00:10:12.585 
"nvme_iov_md": false 00:10:12.585 }, 00:10:12.585 "memory_domains": [ 00:10:12.585 { 00:10:12.585 "dma_device_id": "system", 00:10:12.585 "dma_device_type": 1 00:10:12.585 }, 00:10:12.585 { 00:10:12.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:12.585 "dma_device_type": 2 00:10:12.585 } 00:10:12.585 ], 00:10:12.585 "driver_specific": {} 00:10:12.585 }' 00:10:12.585 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:12.845 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:13.104 "name": "BaseBdev2", 00:10:13.104 "aliases": [ 00:10:13.104 "4845e09d-2794-4e3c-b58e-163f187750cd" 00:10:13.104 ], 00:10:13.104 "product_name": "Malloc disk", 00:10:13.104 "block_size": 512, 00:10:13.104 "num_blocks": 65536, 00:10:13.104 "uuid": "4845e09d-2794-4e3c-b58e-163f187750cd", 00:10:13.104 "assigned_rate_limits": { 00:10:13.104 "rw_ios_per_sec": 0, 00:10:13.104 "rw_mbytes_per_sec": 0, 00:10:13.104 "r_mbytes_per_sec": 0, 00:10:13.104 "w_mbytes_per_sec": 0 00:10:13.104 }, 00:10:13.104 "claimed": true, 00:10:13.104 "claim_type": "exclusive_write", 00:10:13.104 "zoned": false, 00:10:13.104 "supported_io_types": { 00:10:13.104 "read": true, 00:10:13.104 "write": true, 00:10:13.104 "unmap": true, 00:10:13.104 "flush": true, 00:10:13.104 "reset": true, 00:10:13.104 "nvme_admin": false, 00:10:13.104 "nvme_io": false, 00:10:13.104 "nvme_io_md": false, 00:10:13.104 "write_zeroes": true, 00:10:13.104 "zcopy": true, 00:10:13.104 "get_zone_info": false, 00:10:13.104 "zone_management": false, 00:10:13.104 "zone_append": false, 00:10:13.104 "compare": false, 00:10:13.104 "compare_and_write": false, 00:10:13.104 "abort": true, 00:10:13.104 "seek_hole": false, 00:10:13.104 "seek_data": false, 00:10:13.104 "copy": true, 00:10:13.104 "nvme_iov_md": false 00:10:13.104 }, 00:10:13.104 "memory_domains": [ 00:10:13.104 { 00:10:13.104 "dma_device_id": "system", 00:10:13.104 "dma_device_type": 1 00:10:13.104 }, 
00:10:13.104 { 00:10:13.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.104 "dma_device_type": 2 00:10:13.104 } 00:10:13.104 ], 00:10:13.104 "driver_specific": {} 00:10:13.104 }' 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:13.104 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:13.363 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:13.621 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:13.621 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:13.621 [2024-07-15 13:33:01.138590] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:13.621 [2024-07-15 13:33:01.138611] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:13.621 [2024-07-15 13:33:01.138641] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:13.621 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:13.622 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:13.880 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:13.880 "name": "Existed_Raid", 00:10:13.880 "uuid": "13d93776-bdc4-4c7d-9d2b-7bbc8aad9a08", 00:10:13.880 "strip_size_kb": 64, 00:10:13.880 "state": "offline", 00:10:13.880 "raid_level": "concat", 00:10:13.880 "superblock": false, 00:10:13.880 "num_base_bdevs": 2, 00:10:13.880 "num_base_bdevs_discovered": 1, 00:10:13.880 "num_base_bdevs_operational": 1, 00:10:13.880 "base_bdevs_list": [ 00:10:13.880 { 00:10:13.880 "name": null, 00:10:13.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:13.880 "is_configured": false, 00:10:13.880 "data_offset": 0, 00:10:13.880 "data_size": 65536 00:10:13.880 }, 00:10:13.880 { 00:10:13.880 "name": "BaseBdev2", 00:10:13.880 "uuid": "4845e09d-2794-4e3c-b58e-163f187750cd", 00:10:13.880 "is_configured": true, 00:10:13.880 "data_offset": 0, 00:10:13.880 "data_size": 65536 00:10:13.880 } 00:10:13.880 ] 00:10:13.880 }' 00:10:13.880 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:13.880 13:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:14.446 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:14.446 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:14.446 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:14.446 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:14.446 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:14.446 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:14.446 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:14.704 [2024-07-15 13:33:02.190092] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:14.704 [2024-07-15 13:33:02.190133] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa26610 name Existed_Raid, state offline 00:10:14.704 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:14.704 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:14.704 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:14.704 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4170166 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4170166 ']' 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4170166 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4170166 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4170166' 00:10:14.962 killing process with pid 4170166 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4170166 00:10:14.962 [2024-07-15 13:33:02.436636] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:14.962 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4170166 00:10:14.962 [2024-07-15 13:33:02.437421] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:15.220 00:10:15.220 real 0m8.226s 00:10:15.220 user 0m14.464s 00:10:15.220 sys 0m1.592s 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:15.220 ************************************ 00:10:15.220 END TEST raid_state_function_test 00:10:15.220 ************************************ 00:10:15.220 13:33:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:15.220 13:33:02 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:10:15.220 13:33:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:15.220 13:33:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:15.220 13:33:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:15.220 ************************************ 00:10:15.220 START TEST raid_state_function_test_sb 00:10:15.220 ************************************ 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:15.220 13:33:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4171588 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4171588' 00:10:15.220 Process raid pid: 4171588 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4171588 /var/tmp/spdk-raid.sock 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4171588 ']' 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:15.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
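(The waitforlisten above is waiting on a bare bdev_svc application that acts as the RPC target for these raid tests. A minimal sketch of that setup, using only the binary, socket path and flags shown in the trace — the backgrounding and pid capture here are illustrative, not copied from the log:

    # Sketch reconstructed from the trace; not part of the captured output.
    # -r selects the RPC UNIX socket, -i 0 the shared-memory id, and -L bdev_raid
    # enables the *DEBUG* bdev_raid messages seen throughout this log.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!

    # Every subsequent rpc.py call in bdev_raid.sh targets that same socket:
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
)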
00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:15.220 13:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:15.220 [2024-07-15 13:33:02.745445] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:10:15.220 [2024-07-15 13:33:02.745491] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:15.220 [2024-07-15 13:33:02.829934] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:15.478 [2024-07-15 13:33:02.922740] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:15.478 [2024-07-15 13:33:02.982658] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:15.478 [2024-07-15 13:33:02.982683] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:16.043 13:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:16.043 13:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:16.043 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:16.302 [2024-07-15 13:33:03.714429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:16.302 [2024-07-15 13:33:03.714461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:16.302 [2024-07-15 13:33:03.714468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:16.302 [2024-07-15 13:33:03.714491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:16.302 "name": "Existed_Raid", 00:10:16.302 "uuid": "c7a2c0e4-bf15-443b-907d-c741be6c3c43", 00:10:16.302 "strip_size_kb": 64, 00:10:16.302 "state": "configuring", 00:10:16.302 "raid_level": "concat", 00:10:16.302 "superblock": true, 00:10:16.302 "num_base_bdevs": 2, 00:10:16.302 "num_base_bdevs_discovered": 0, 00:10:16.302 "num_base_bdevs_operational": 2, 00:10:16.302 "base_bdevs_list": [ 00:10:16.302 { 00:10:16.302 "name": "BaseBdev1", 00:10:16.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:16.302 "is_configured": false, 00:10:16.302 "data_offset": 0, 00:10:16.302 "data_size": 0 00:10:16.302 }, 00:10:16.302 { 00:10:16.302 "name": "BaseBdev2", 00:10:16.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:16.302 "is_configured": false, 00:10:16.302 "data_offset": 0, 00:10:16.302 "data_size": 0 00:10:16.302 } 00:10:16.302 ] 00:10:16.302 }' 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:16.302 13:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:16.867 13:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:17.125 [2024-07-15 13:33:04.528448] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:17.125 [2024-07-15 13:33:04.528475] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc50f30 name Existed_Raid, state configuring 00:10:17.125 13:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:17.125 [2024-07-15 13:33:04.712937] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:17.125 [2024-07-15 13:33:04.712959] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:17.125 [2024-07-15 13:33:04.712966] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:17.125 [2024-07-15 13:33:04.712973] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:17.125 13:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:17.382 [2024-07-15 13:33:04.906323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:17.382 BaseBdev1 00:10:17.382 13:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:17.382 13:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:17.382 13:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:17.382 13:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:17.382 13:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:17.382 13:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:17.382 
13:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:17.639 13:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:17.898 [ 00:10:17.898 { 00:10:17.898 "name": "BaseBdev1", 00:10:17.898 "aliases": [ 00:10:17.898 "4de1d366-0970-4897-801e-3bb41da3b758" 00:10:17.898 ], 00:10:17.898 "product_name": "Malloc disk", 00:10:17.898 "block_size": 512, 00:10:17.898 "num_blocks": 65536, 00:10:17.898 "uuid": "4de1d366-0970-4897-801e-3bb41da3b758", 00:10:17.898 "assigned_rate_limits": { 00:10:17.898 "rw_ios_per_sec": 0, 00:10:17.898 "rw_mbytes_per_sec": 0, 00:10:17.898 "r_mbytes_per_sec": 0, 00:10:17.898 "w_mbytes_per_sec": 0 00:10:17.898 }, 00:10:17.898 "claimed": true, 00:10:17.898 "claim_type": "exclusive_write", 00:10:17.898 "zoned": false, 00:10:17.898 "supported_io_types": { 00:10:17.898 "read": true, 00:10:17.898 "write": true, 00:10:17.898 "unmap": true, 00:10:17.898 "flush": true, 00:10:17.898 "reset": true, 00:10:17.898 "nvme_admin": false, 00:10:17.898 "nvme_io": false, 00:10:17.898 "nvme_io_md": false, 00:10:17.898 "write_zeroes": true, 00:10:17.898 "zcopy": true, 00:10:17.898 "get_zone_info": false, 00:10:17.898 "zone_management": false, 00:10:17.898 "zone_append": false, 00:10:17.898 "compare": false, 00:10:17.898 "compare_and_write": false, 00:10:17.898 "abort": true, 00:10:17.898 "seek_hole": false, 00:10:17.898 "seek_data": false, 00:10:17.898 "copy": true, 00:10:17.898 "nvme_iov_md": false 00:10:17.898 }, 00:10:17.898 "memory_domains": [ 00:10:17.898 { 00:10:17.898 "dma_device_id": "system", 00:10:17.898 "dma_device_type": 1 00:10:17.898 }, 00:10:17.898 { 00:10:17.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.898 "dma_device_type": 2 00:10:17.898 } 00:10:17.898 ], 00:10:17.898 "driver_specific": {} 00:10:17.898 } 00:10:17.898 ] 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:17.898 "name": "Existed_Raid", 00:10:17.898 "uuid": "87942ebe-b764-4543-a727-6718c0f5d848", 00:10:17.898 "strip_size_kb": 64, 00:10:17.898 "state": "configuring", 00:10:17.898 "raid_level": "concat", 00:10:17.898 "superblock": true, 00:10:17.898 "num_base_bdevs": 2, 00:10:17.898 "num_base_bdevs_discovered": 1, 00:10:17.898 "num_base_bdevs_operational": 2, 00:10:17.898 "base_bdevs_list": [ 00:10:17.898 { 00:10:17.898 "name": "BaseBdev1", 00:10:17.898 "uuid": "4de1d366-0970-4897-801e-3bb41da3b758", 00:10:17.898 "is_configured": true, 00:10:17.898 "data_offset": 2048, 00:10:17.898 "data_size": 63488 00:10:17.898 }, 00:10:17.898 { 00:10:17.898 "name": "BaseBdev2", 00:10:17.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.898 "is_configured": false, 00:10:17.898 "data_offset": 0, 00:10:17.898 "data_size": 0 00:10:17.898 } 00:10:17.898 ] 00:10:17.898 }' 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:17.898 13:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:18.465 13:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:18.722 [2024-07-15 13:33:06.085411] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:18.722 [2024-07-15 13:33:06.085453] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc50820 name Existed_Raid, state configuring 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:18.722 [2024-07-15 13:33:06.269897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:18.722 [2024-07-15 13:33:06.271002] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:18.722 [2024-07-15 13:33:06.271028] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:18.722 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:18.979 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:18.979 "name": "Existed_Raid", 00:10:18.979 "uuid": "c979e4e1-dc6e-4cbe-a0d3-98dae4c6aea6", 00:10:18.979 "strip_size_kb": 64, 00:10:18.979 "state": "configuring", 00:10:18.979 "raid_level": "concat", 00:10:18.979 "superblock": true, 00:10:18.979 "num_base_bdevs": 2, 00:10:18.979 "num_base_bdevs_discovered": 1, 00:10:18.979 "num_base_bdevs_operational": 2, 00:10:18.979 "base_bdevs_list": [ 00:10:18.979 { 00:10:18.979 "name": "BaseBdev1", 00:10:18.979 "uuid": "4de1d366-0970-4897-801e-3bb41da3b758", 00:10:18.979 "is_configured": true, 00:10:18.979 "data_offset": 2048, 00:10:18.979 "data_size": 63488 00:10:18.979 }, 00:10:18.979 { 00:10:18.979 "name": "BaseBdev2", 00:10:18.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:18.979 "is_configured": false, 00:10:18.979 "data_offset": 0, 00:10:18.979 "data_size": 0 00:10:18.979 } 00:10:18.979 ] 00:10:18.979 }' 00:10:18.979 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:18.979 13:33:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:19.543 13:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:19.544 [2024-07-15 13:33:07.134878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:19.544 [2024-07-15 13:33:07.134992] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc51610 00:10:19.544 [2024-07-15 13:33:07.135008] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:19.544 [2024-07-15 13:33:07.135148] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe05000 00:10:19.544 [2024-07-15 13:33:07.135238] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc51610 00:10:19.544 [2024-07-15 13:33:07.135245] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc51610 00:10:19.544 [2024-07-15 13:33:07.135310] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:19.544 BaseBdev2 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:19.544 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:19.801 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:20.058 [ 00:10:20.058 { 00:10:20.058 "name": "BaseBdev2", 00:10:20.058 "aliases": [ 00:10:20.058 "2bc89f1f-d4cb-44d1-9095-5bac26c49775" 00:10:20.058 ], 00:10:20.058 "product_name": "Malloc disk", 00:10:20.058 "block_size": 512, 00:10:20.058 "num_blocks": 65536, 00:10:20.058 "uuid": "2bc89f1f-d4cb-44d1-9095-5bac26c49775", 00:10:20.058 "assigned_rate_limits": { 00:10:20.058 "rw_ios_per_sec": 0, 00:10:20.058 "rw_mbytes_per_sec": 0, 00:10:20.058 "r_mbytes_per_sec": 0, 00:10:20.058 "w_mbytes_per_sec": 0 00:10:20.058 }, 00:10:20.058 "claimed": true, 00:10:20.058 "claim_type": "exclusive_write", 00:10:20.058 "zoned": false, 00:10:20.058 "supported_io_types": { 00:10:20.058 "read": true, 00:10:20.058 "write": true, 00:10:20.058 "unmap": true, 00:10:20.058 "flush": true, 00:10:20.058 "reset": true, 00:10:20.058 "nvme_admin": false, 00:10:20.058 "nvme_io": false, 00:10:20.058 "nvme_io_md": false, 00:10:20.058 "write_zeroes": true, 00:10:20.058 "zcopy": true, 00:10:20.058 "get_zone_info": false, 00:10:20.058 "zone_management": false, 00:10:20.058 "zone_append": false, 00:10:20.058 "compare": false, 00:10:20.058 "compare_and_write": false, 00:10:20.058 "abort": true, 00:10:20.058 "seek_hole": false, 00:10:20.058 "seek_data": false, 00:10:20.058 "copy": true, 00:10:20.058 "nvme_iov_md": false 00:10:20.058 }, 00:10:20.058 "memory_domains": [ 00:10:20.058 { 00:10:20.058 "dma_device_id": "system", 00:10:20.058 "dma_device_type": 1 00:10:20.058 }, 00:10:20.058 { 00:10:20.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.058 "dma_device_type": 2 00:10:20.058 } 00:10:20.058 ], 00:10:20.058 "driver_specific": {} 00:10:20.058 } 00:10:20.058 ] 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:20.058 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:20.058 
13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:20.059 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:20.059 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.059 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:20.316 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:20.316 "name": "Existed_Raid", 00:10:20.316 "uuid": "c979e4e1-dc6e-4cbe-a0d3-98dae4c6aea6", 00:10:20.316 "strip_size_kb": 64, 00:10:20.316 "state": "online", 00:10:20.316 "raid_level": "concat", 00:10:20.316 "superblock": true, 00:10:20.316 "num_base_bdevs": 2, 00:10:20.316 "num_base_bdevs_discovered": 2, 00:10:20.316 "num_base_bdevs_operational": 2, 00:10:20.316 "base_bdevs_list": [ 00:10:20.316 { 00:10:20.316 "name": "BaseBdev1", 00:10:20.316 "uuid": "4de1d366-0970-4897-801e-3bb41da3b758", 00:10:20.316 "is_configured": true, 00:10:20.316 "data_offset": 2048, 00:10:20.316 "data_size": 63488 00:10:20.316 }, 00:10:20.316 { 00:10:20.316 "name": "BaseBdev2", 00:10:20.316 "uuid": "2bc89f1f-d4cb-44d1-9095-5bac26c49775", 00:10:20.316 "is_configured": true, 00:10:20.316 "data_offset": 2048, 00:10:20.316 "data_size": 63488 00:10:20.316 } 00:10:20.316 ] 00:10:20.316 }' 00:10:20.316 13:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.316 13:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:20.574 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:20.832 [2024-07-15 13:33:08.334158] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:20.832 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:20.832 "name": "Existed_Raid", 00:10:20.832 "aliases": [ 00:10:20.832 "c979e4e1-dc6e-4cbe-a0d3-98dae4c6aea6" 00:10:20.832 ], 00:10:20.832 "product_name": "Raid Volume", 00:10:20.832 "block_size": 512, 00:10:20.832 "num_blocks": 126976, 00:10:20.832 "uuid": "c979e4e1-dc6e-4cbe-a0d3-98dae4c6aea6", 00:10:20.832 "assigned_rate_limits": { 00:10:20.832 "rw_ios_per_sec": 0, 00:10:20.832 "rw_mbytes_per_sec": 0, 00:10:20.832 "r_mbytes_per_sec": 0, 00:10:20.832 "w_mbytes_per_sec": 0 00:10:20.832 }, 00:10:20.832 "claimed": false, 00:10:20.832 "zoned": false, 00:10:20.832 
"supported_io_types": { 00:10:20.832 "read": true, 00:10:20.832 "write": true, 00:10:20.832 "unmap": true, 00:10:20.832 "flush": true, 00:10:20.832 "reset": true, 00:10:20.832 "nvme_admin": false, 00:10:20.832 "nvme_io": false, 00:10:20.832 "nvme_io_md": false, 00:10:20.832 "write_zeroes": true, 00:10:20.832 "zcopy": false, 00:10:20.832 "get_zone_info": false, 00:10:20.832 "zone_management": false, 00:10:20.832 "zone_append": false, 00:10:20.832 "compare": false, 00:10:20.832 "compare_and_write": false, 00:10:20.832 "abort": false, 00:10:20.832 "seek_hole": false, 00:10:20.832 "seek_data": false, 00:10:20.832 "copy": false, 00:10:20.832 "nvme_iov_md": false 00:10:20.832 }, 00:10:20.832 "memory_domains": [ 00:10:20.832 { 00:10:20.832 "dma_device_id": "system", 00:10:20.832 "dma_device_type": 1 00:10:20.832 }, 00:10:20.832 { 00:10:20.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.832 "dma_device_type": 2 00:10:20.832 }, 00:10:20.832 { 00:10:20.832 "dma_device_id": "system", 00:10:20.832 "dma_device_type": 1 00:10:20.832 }, 00:10:20.832 { 00:10:20.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.832 "dma_device_type": 2 00:10:20.832 } 00:10:20.832 ], 00:10:20.832 "driver_specific": { 00:10:20.832 "raid": { 00:10:20.832 "uuid": "c979e4e1-dc6e-4cbe-a0d3-98dae4c6aea6", 00:10:20.832 "strip_size_kb": 64, 00:10:20.832 "state": "online", 00:10:20.832 "raid_level": "concat", 00:10:20.832 "superblock": true, 00:10:20.832 "num_base_bdevs": 2, 00:10:20.832 "num_base_bdevs_discovered": 2, 00:10:20.832 "num_base_bdevs_operational": 2, 00:10:20.832 "base_bdevs_list": [ 00:10:20.832 { 00:10:20.832 "name": "BaseBdev1", 00:10:20.832 "uuid": "4de1d366-0970-4897-801e-3bb41da3b758", 00:10:20.832 "is_configured": true, 00:10:20.832 "data_offset": 2048, 00:10:20.832 "data_size": 63488 00:10:20.832 }, 00:10:20.832 { 00:10:20.832 "name": "BaseBdev2", 00:10:20.832 "uuid": "2bc89f1f-d4cb-44d1-9095-5bac26c49775", 00:10:20.832 "is_configured": true, 00:10:20.832 "data_offset": 2048, 00:10:20.832 "data_size": 63488 00:10:20.832 } 00:10:20.832 ] 00:10:20.832 } 00:10:20.832 } 00:10:20.832 }' 00:10:20.832 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:20.832 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:20.832 BaseBdev2' 00:10:20.832 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:20.832 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:20.832 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:21.090 "name": "BaseBdev1", 00:10:21.090 "aliases": [ 00:10:21.090 "4de1d366-0970-4897-801e-3bb41da3b758" 00:10:21.090 ], 00:10:21.090 "product_name": "Malloc disk", 00:10:21.090 "block_size": 512, 00:10:21.090 "num_blocks": 65536, 00:10:21.090 "uuid": "4de1d366-0970-4897-801e-3bb41da3b758", 00:10:21.090 "assigned_rate_limits": { 00:10:21.090 "rw_ios_per_sec": 0, 00:10:21.090 "rw_mbytes_per_sec": 0, 00:10:21.090 "r_mbytes_per_sec": 0, 00:10:21.090 "w_mbytes_per_sec": 0 00:10:21.090 }, 00:10:21.090 "claimed": true, 00:10:21.090 "claim_type": "exclusive_write", 00:10:21.090 "zoned": 
false, 00:10:21.090 "supported_io_types": { 00:10:21.090 "read": true, 00:10:21.090 "write": true, 00:10:21.090 "unmap": true, 00:10:21.090 "flush": true, 00:10:21.090 "reset": true, 00:10:21.090 "nvme_admin": false, 00:10:21.090 "nvme_io": false, 00:10:21.090 "nvme_io_md": false, 00:10:21.090 "write_zeroes": true, 00:10:21.090 "zcopy": true, 00:10:21.090 "get_zone_info": false, 00:10:21.090 "zone_management": false, 00:10:21.090 "zone_append": false, 00:10:21.090 "compare": false, 00:10:21.090 "compare_and_write": false, 00:10:21.090 "abort": true, 00:10:21.090 "seek_hole": false, 00:10:21.090 "seek_data": false, 00:10:21.090 "copy": true, 00:10:21.090 "nvme_iov_md": false 00:10:21.090 }, 00:10:21.090 "memory_domains": [ 00:10:21.090 { 00:10:21.090 "dma_device_id": "system", 00:10:21.090 "dma_device_type": 1 00:10:21.090 }, 00:10:21.090 { 00:10:21.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.090 "dma_device_type": 2 00:10:21.090 } 00:10:21.090 ], 00:10:21.090 "driver_specific": {} 00:10:21.090 }' 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.090 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:21.348 13:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:21.605 "name": "BaseBdev2", 00:10:21.605 "aliases": [ 00:10:21.605 "2bc89f1f-d4cb-44d1-9095-5bac26c49775" 00:10:21.605 ], 00:10:21.605 "product_name": "Malloc disk", 00:10:21.605 "block_size": 512, 00:10:21.605 "num_blocks": 65536, 00:10:21.605 "uuid": "2bc89f1f-d4cb-44d1-9095-5bac26c49775", 00:10:21.605 "assigned_rate_limits": { 00:10:21.605 "rw_ios_per_sec": 0, 00:10:21.605 "rw_mbytes_per_sec": 0, 00:10:21.605 "r_mbytes_per_sec": 0, 00:10:21.605 "w_mbytes_per_sec": 0 00:10:21.605 }, 00:10:21.605 "claimed": true, 00:10:21.605 "claim_type": "exclusive_write", 00:10:21.605 "zoned": false, 00:10:21.605 "supported_io_types": { 00:10:21.605 "read": true, 00:10:21.605 "write": true, 00:10:21.605 "unmap": true, 
00:10:21.605 "flush": true, 00:10:21.605 "reset": true, 00:10:21.605 "nvme_admin": false, 00:10:21.605 "nvme_io": false, 00:10:21.605 "nvme_io_md": false, 00:10:21.605 "write_zeroes": true, 00:10:21.605 "zcopy": true, 00:10:21.605 "get_zone_info": false, 00:10:21.605 "zone_management": false, 00:10:21.605 "zone_append": false, 00:10:21.605 "compare": false, 00:10:21.605 "compare_and_write": false, 00:10:21.605 "abort": true, 00:10:21.605 "seek_hole": false, 00:10:21.605 "seek_data": false, 00:10:21.605 "copy": true, 00:10:21.605 "nvme_iov_md": false 00:10:21.605 }, 00:10:21.605 "memory_domains": [ 00:10:21.605 { 00:10:21.605 "dma_device_id": "system", 00:10:21.605 "dma_device_type": 1 00:10:21.605 }, 00:10:21.605 { 00:10:21.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.605 "dma_device_type": 2 00:10:21.605 } 00:10:21.605 ], 00:10:21.605 "driver_specific": {} 00:10:21.605 }' 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.605 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.862 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:21.862 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.862 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.862 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:21.862 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:22.120 [2024-07-15 13:33:09.497060] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:22.120 [2024-07-15 13:33:09.497084] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:22.120 [2024-07-15 13:33:09.497115] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:22.120 
13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:22.120 "name": "Existed_Raid", 00:10:22.120 "uuid": "c979e4e1-dc6e-4cbe-a0d3-98dae4c6aea6", 00:10:22.120 "strip_size_kb": 64, 00:10:22.120 "state": "offline", 00:10:22.120 "raid_level": "concat", 00:10:22.120 "superblock": true, 00:10:22.120 "num_base_bdevs": 2, 00:10:22.120 "num_base_bdevs_discovered": 1, 00:10:22.120 "num_base_bdevs_operational": 1, 00:10:22.120 "base_bdevs_list": [ 00:10:22.120 { 00:10:22.120 "name": null, 00:10:22.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:22.120 "is_configured": false, 00:10:22.120 "data_offset": 2048, 00:10:22.120 "data_size": 63488 00:10:22.120 }, 00:10:22.120 { 00:10:22.120 "name": "BaseBdev2", 00:10:22.120 "uuid": "2bc89f1f-d4cb-44d1-9095-5bac26c49775", 00:10:22.120 "is_configured": true, 00:10:22.120 "data_offset": 2048, 00:10:22.120 "data_size": 63488 00:10:22.120 } 00:10:22.120 ] 00:10:22.120 }' 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:22.120 13:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:22.687 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:22.687 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:22.687 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:22.687 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:22.945 [2024-07-15 13:33:10.529377] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:22.945 [2024-07-15 
13:33:10.529421] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc51610 name Existed_Raid, state offline 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:22.945 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4171588 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4171588 ']' 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4171588 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:23.202 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4171588 00:10:23.203 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:23.203 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:23.203 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4171588' 00:10:23.203 killing process with pid 4171588 00:10:23.203 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4171588 00:10:23.203 [2024-07-15 13:33:10.777945] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:23.203 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4171588 00:10:23.203 [2024-07-15 13:33:10.778858] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:23.460 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:23.460 00:10:23.460 real 0m8.281s 00:10:23.460 user 0m14.543s 00:10:23.460 sys 0m1.613s 00:10:23.460 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:23.460 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:23.460 ************************************ 00:10:23.460 END TEST raid_state_function_test_sb 00:10:23.460 ************************************ 00:10:23.460 13:33:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:23.460 13:33:11 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:10:23.460 13:33:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:23.460 13:33:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:23.460 13:33:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
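(Before the superblock test below, the RPC sequence the two state-function tests above exercised can be summarized. This sketch is assembled only from rpc.py invocations visible in the trace; it assumes the bdev_svc target is already listening on /var/tmp/spdk-raid.sock and simplifies the jq filters to pull out just the state field:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Two 32 MiB malloc base bdevs with 512-byte blocks (65536 blocks each, as reported above)
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2

    # Concat array with a 64 KiB strip size; -s adds the on-disk superblock
    # (the plain raid_state_function_test ran the same flow without -s)
    $RPC bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

    # State is reported as "online" once both base bdevs are claimed
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

    # concat has no redundancy, so removing a base bdev drives the array to "offline"
    $RPC bdev_malloc_delete BaseBdev1
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
)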
00:10:23.460 ************************************ 00:10:23.460 START TEST raid_superblock_test 00:10:23.460 ************************************ 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:23.460 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4173258 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4173258 /var/tmp/spdk-raid.sock 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4173258 ']' 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:23.461 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:23.461 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:23.719 [2024-07-15 13:33:11.102889] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:10:23.719 [2024-07-15 13:33:11.102935] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4173258 ] 00:10:23.719 [2024-07-15 13:33:11.190591] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:23.719 [2024-07-15 13:33:11.281387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:23.977 [2024-07-15 13:33:11.341390] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:23.977 [2024-07-15 13:33:11.341419] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:24.543 13:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:24.543 malloc1 00:10:24.543 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:24.800 [2024-07-15 13:33:12.239484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:24.800 [2024-07-15 13:33:12.239522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:24.800 [2024-07-15 13:33:12.239554] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe70260 00:10:24.800 [2024-07-15 13:33:12.239563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:24.800 [2024-07-15 13:33:12.240833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:24.800 [2024-07-15 13:33:12.240856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:24.800 pt1 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:24.800 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:24.800 malloc2 00:10:25.057 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:25.057 [2024-07-15 13:33:12.585519] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:25.057 [2024-07-15 13:33:12.585559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:25.057 [2024-07-15 13:33:12.585578] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101a310 00:10:25.057 [2024-07-15 13:33:12.585586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:25.057 [2024-07-15 13:33:12.586811] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:25.057 [2024-07-15 13:33:12.586833] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:25.057 pt2 00:10:25.057 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:25.057 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:25.057 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:10:25.314 [2024-07-15 13:33:12.762002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:25.314 [2024-07-15 13:33:12.763033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:25.314 [2024-07-15 13:33:12.763157] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10195b0 00:10:25.314 [2024-07-15 13:33:12.763166] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:25.314 [2024-07-15 13:33:12.763310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe6fb90 00:10:25.314 [2024-07-15 13:33:12.763416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10195b0 00:10:25.314 [2024-07-15 13:33:12.763423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10195b0 00:10:25.314 [2024-07-15 13:33:12.763497] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:25.314 13:33:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.314 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:25.571 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:25.571 "name": "raid_bdev1", 00:10:25.571 "uuid": "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:25.571 "strip_size_kb": 64, 00:10:25.571 "state": "online", 00:10:25.571 "raid_level": "concat", 00:10:25.571 "superblock": true, 00:10:25.571 "num_base_bdevs": 2, 00:10:25.571 "num_base_bdevs_discovered": 2, 00:10:25.571 "num_base_bdevs_operational": 2, 00:10:25.571 "base_bdevs_list": [ 00:10:25.571 { 00:10:25.571 "name": "pt1", 00:10:25.571 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:25.571 "is_configured": true, 00:10:25.571 "data_offset": 2048, 00:10:25.571 "data_size": 63488 00:10:25.571 }, 00:10:25.571 { 00:10:25.571 "name": "pt2", 00:10:25.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:25.571 "is_configured": true, 00:10:25.571 "data_offset": 2048, 00:10:25.571 "data_size": 63488 00:10:25.571 } 00:10:25.571 ] 00:10:25.571 }' 00:10:25.571 13:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:25.571 13:33:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:26.135 [2024-07-15 13:33:13.608333] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:26.135 "name": "raid_bdev1", 00:10:26.135 "aliases": [ 00:10:26.135 "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6" 00:10:26.135 ], 00:10:26.135 "product_name": "Raid Volume", 00:10:26.135 "block_size": 512, 00:10:26.135 "num_blocks": 126976, 00:10:26.135 "uuid": 
"8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:26.135 "assigned_rate_limits": { 00:10:26.135 "rw_ios_per_sec": 0, 00:10:26.135 "rw_mbytes_per_sec": 0, 00:10:26.135 "r_mbytes_per_sec": 0, 00:10:26.135 "w_mbytes_per_sec": 0 00:10:26.135 }, 00:10:26.135 "claimed": false, 00:10:26.135 "zoned": false, 00:10:26.135 "supported_io_types": { 00:10:26.135 "read": true, 00:10:26.135 "write": true, 00:10:26.135 "unmap": true, 00:10:26.135 "flush": true, 00:10:26.135 "reset": true, 00:10:26.135 "nvme_admin": false, 00:10:26.135 "nvme_io": false, 00:10:26.135 "nvme_io_md": false, 00:10:26.135 "write_zeroes": true, 00:10:26.135 "zcopy": false, 00:10:26.135 "get_zone_info": false, 00:10:26.135 "zone_management": false, 00:10:26.135 "zone_append": false, 00:10:26.135 "compare": false, 00:10:26.135 "compare_and_write": false, 00:10:26.135 "abort": false, 00:10:26.135 "seek_hole": false, 00:10:26.135 "seek_data": false, 00:10:26.135 "copy": false, 00:10:26.135 "nvme_iov_md": false 00:10:26.135 }, 00:10:26.135 "memory_domains": [ 00:10:26.135 { 00:10:26.135 "dma_device_id": "system", 00:10:26.135 "dma_device_type": 1 00:10:26.135 }, 00:10:26.135 { 00:10:26.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.135 "dma_device_type": 2 00:10:26.135 }, 00:10:26.135 { 00:10:26.135 "dma_device_id": "system", 00:10:26.135 "dma_device_type": 1 00:10:26.135 }, 00:10:26.135 { 00:10:26.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.135 "dma_device_type": 2 00:10:26.135 } 00:10:26.135 ], 00:10:26.135 "driver_specific": { 00:10:26.135 "raid": { 00:10:26.135 "uuid": "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:26.135 "strip_size_kb": 64, 00:10:26.135 "state": "online", 00:10:26.135 "raid_level": "concat", 00:10:26.135 "superblock": true, 00:10:26.135 "num_base_bdevs": 2, 00:10:26.135 "num_base_bdevs_discovered": 2, 00:10:26.135 "num_base_bdevs_operational": 2, 00:10:26.135 "base_bdevs_list": [ 00:10:26.135 { 00:10:26.135 "name": "pt1", 00:10:26.135 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:26.135 "is_configured": true, 00:10:26.135 "data_offset": 2048, 00:10:26.135 "data_size": 63488 00:10:26.135 }, 00:10:26.135 { 00:10:26.135 "name": "pt2", 00:10:26.135 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:26.135 "is_configured": true, 00:10:26.135 "data_offset": 2048, 00:10:26.135 "data_size": 63488 00:10:26.135 } 00:10:26.135 ] 00:10:26.135 } 00:10:26.135 } 00:10:26.135 }' 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:26.135 pt2' 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:26.135 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:26.392 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:26.392 "name": "pt1", 00:10:26.392 "aliases": [ 00:10:26.392 "00000000-0000-0000-0000-000000000001" 00:10:26.392 ], 00:10:26.392 "product_name": "passthru", 00:10:26.392 "block_size": 512, 00:10:26.392 "num_blocks": 65536, 00:10:26.392 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:26.392 "assigned_rate_limits": { 00:10:26.392 
"rw_ios_per_sec": 0, 00:10:26.392 "rw_mbytes_per_sec": 0, 00:10:26.392 "r_mbytes_per_sec": 0, 00:10:26.392 "w_mbytes_per_sec": 0 00:10:26.392 }, 00:10:26.392 "claimed": true, 00:10:26.392 "claim_type": "exclusive_write", 00:10:26.392 "zoned": false, 00:10:26.392 "supported_io_types": { 00:10:26.392 "read": true, 00:10:26.392 "write": true, 00:10:26.392 "unmap": true, 00:10:26.392 "flush": true, 00:10:26.392 "reset": true, 00:10:26.392 "nvme_admin": false, 00:10:26.392 "nvme_io": false, 00:10:26.392 "nvme_io_md": false, 00:10:26.392 "write_zeroes": true, 00:10:26.392 "zcopy": true, 00:10:26.392 "get_zone_info": false, 00:10:26.392 "zone_management": false, 00:10:26.392 "zone_append": false, 00:10:26.392 "compare": false, 00:10:26.392 "compare_and_write": false, 00:10:26.392 "abort": true, 00:10:26.392 "seek_hole": false, 00:10:26.392 "seek_data": false, 00:10:26.392 "copy": true, 00:10:26.392 "nvme_iov_md": false 00:10:26.392 }, 00:10:26.392 "memory_domains": [ 00:10:26.392 { 00:10:26.392 "dma_device_id": "system", 00:10:26.392 "dma_device_type": 1 00:10:26.392 }, 00:10:26.392 { 00:10:26.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.392 "dma_device_type": 2 00:10:26.392 } 00:10:26.392 ], 00:10:26.392 "driver_specific": { 00:10:26.392 "passthru": { 00:10:26.392 "name": "pt1", 00:10:26.392 "base_bdev_name": "malloc1" 00:10:26.392 } 00:10:26.392 } 00:10:26.392 }' 00:10:26.392 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.392 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.392 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:26.392 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.392 13:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:26.649 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:26.906 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:26.906 "name": "pt2", 00:10:26.906 "aliases": [ 00:10:26.906 "00000000-0000-0000-0000-000000000002" 00:10:26.906 ], 00:10:26.906 "product_name": "passthru", 00:10:26.906 "block_size": 512, 00:10:26.906 "num_blocks": 65536, 00:10:26.906 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:26.906 "assigned_rate_limits": { 00:10:26.906 "rw_ios_per_sec": 0, 00:10:26.906 "rw_mbytes_per_sec": 0, 00:10:26.906 "r_mbytes_per_sec": 0, 00:10:26.906 "w_mbytes_per_sec": 0 
00:10:26.906 }, 00:10:26.906 "claimed": true, 00:10:26.906 "claim_type": "exclusive_write", 00:10:26.906 "zoned": false, 00:10:26.906 "supported_io_types": { 00:10:26.906 "read": true, 00:10:26.906 "write": true, 00:10:26.906 "unmap": true, 00:10:26.906 "flush": true, 00:10:26.906 "reset": true, 00:10:26.906 "nvme_admin": false, 00:10:26.906 "nvme_io": false, 00:10:26.906 "nvme_io_md": false, 00:10:26.906 "write_zeroes": true, 00:10:26.906 "zcopy": true, 00:10:26.906 "get_zone_info": false, 00:10:26.906 "zone_management": false, 00:10:26.906 "zone_append": false, 00:10:26.906 "compare": false, 00:10:26.906 "compare_and_write": false, 00:10:26.906 "abort": true, 00:10:26.906 "seek_hole": false, 00:10:26.906 "seek_data": false, 00:10:26.907 "copy": true, 00:10:26.907 "nvme_iov_md": false 00:10:26.907 }, 00:10:26.907 "memory_domains": [ 00:10:26.907 { 00:10:26.907 "dma_device_id": "system", 00:10:26.907 "dma_device_type": 1 00:10:26.907 }, 00:10:26.907 { 00:10:26.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.907 "dma_device_type": 2 00:10:26.907 } 00:10:26.907 ], 00:10:26.907 "driver_specific": { 00:10:26.907 "passthru": { 00:10:26.907 "name": "pt2", 00:10:26.907 "base_bdev_name": "malloc2" 00:10:26.907 } 00:10:26.907 } 00:10:26.907 }' 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:26.907 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:27.165 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:27.423 [2024-07-15 13:33:14.819434] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:27.423 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8a83fb6a-f0f5-40e5-969d-2364d3df9ce6 00:10:27.423 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8a83fb6a-f0f5-40e5-969d-2364d3df9ce6 ']' 00:10:27.423 13:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:27.423 [2024-07-15 13:33:14.999734] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:27.423 [2024-07-15 13:33:14.999752] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
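(The state and property verification traced above reduces to a handful of RPC-plus-jq probes; the loop below is a compressed sketch of those checks, not the verbatim test script.)

    # Array-level state: expect "online", raid_level concat, strip_size_kb 64,
    # and 2 of 2 base bdevs discovered/operational.
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "raid_bdev1")'

    # Per-member properties via the generic bdev query: each passthru member is
    # expected to report a 512-byte block size and no metadata/DIF configuration.
    for pt in pt1 pt2; do
            info=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$pt" | jq '.[]')
            echo "$info" | jq .block_size      # 512
            echo "$info" | jq .md_size         # null
            echo "$info" | jq .md_interleave   # null
            echo "$info" | jq .dif_type        # null
    done
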
bdev state changing from online to offline 00:10:27.423 [2024-07-15 13:33:14.999792] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:27.423 [2024-07-15 13:33:14.999824] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:27.423 [2024-07-15 13:33:14.999832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10195b0 name raid_bdev1, state offline 00:10:27.423 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.423 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:27.680 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:27.680 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:27.680 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:27.680 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:27.938 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:27.938 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:27.938 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:27.938 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:28.196 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:28.196 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:28.196 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:28.197 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:28.455 [2024-07-15 13:33:15.877979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:28.455 [2024-07-15 13:33:15.878968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:28.455 [2024-07-15 13:33:15.879020] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:28.455 [2024-07-15 13:33:15.879049] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:28.455 [2024-07-15 13:33:15.879077] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:28.455 [2024-07-15 13:33:15.879085] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101ad40 name raid_bdev1, state configuring 00:10:28.455 request: 00:10:28.455 { 00:10:28.455 "name": "raid_bdev1", 00:10:28.455 "raid_level": "concat", 00:10:28.455 "base_bdevs": [ 00:10:28.455 "malloc1", 00:10:28.455 "malloc2" 00:10:28.455 ], 00:10:28.455 "strip_size_kb": 64, 00:10:28.455 "superblock": false, 00:10:28.455 "method": "bdev_raid_create", 00:10:28.455 "req_id": 1 00:10:28.455 } 00:10:28.455 Got JSON-RPC error response 00:10:28.455 response: 00:10:28.455 { 00:10:28.455 "code": -17, 00:10:28.455 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:28.455 } 00:10:28.455 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:28.455 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:28.455 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:28.455 13:33:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:28.455 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.455 13:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:28.713 [2024-07-15 13:33:16.234885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:28.713 [2024-07-15 13:33:16.234923] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:28.713 [2024-07-15 13:33:16.234953] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe6f050 00:10:28.713 [2024-07-15 13:33:16.234961] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:28.713 [2024-07-15 13:33:16.236221] 
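(The NOT-wrapped bdev_raid_create above is the negative half of the superblock test: with the passthru bdevs torn down, malloc1 and malloc2 still carry the raid superblock, so creating a fresh array directly on them must be rejected. The snippet below sketches that expectation with a plain if-guard in place of the test suite's NOT helper; paths are abbreviated as before.)

    # Expected to fail with JSON-RPC error -17 ("File exists"): the superblock on
    # malloc1/malloc2 already names a different raid bdev.
    if scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
            -b 'malloc1 malloc2' -n raid_bdev1; then
            echo "bdev_raid_create unexpectedly succeeded" >&2
            exit 1
    fi
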
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:28.713 [2024-07-15 13:33:16.236244] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:28.713 [2024-07-15 13:33:16.236295] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:28.713 [2024-07-15 13:33:16.236316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:28.713 pt1 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.713 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:28.972 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:28.972 "name": "raid_bdev1", 00:10:28.972 "uuid": "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:28.972 "strip_size_kb": 64, 00:10:28.972 "state": "configuring", 00:10:28.972 "raid_level": "concat", 00:10:28.972 "superblock": true, 00:10:28.972 "num_base_bdevs": 2, 00:10:28.972 "num_base_bdevs_discovered": 1, 00:10:28.972 "num_base_bdevs_operational": 2, 00:10:28.972 "base_bdevs_list": [ 00:10:28.972 { 00:10:28.972 "name": "pt1", 00:10:28.972 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:28.972 "is_configured": true, 00:10:28.972 "data_offset": 2048, 00:10:28.972 "data_size": 63488 00:10:28.972 }, 00:10:28.972 { 00:10:28.972 "name": null, 00:10:28.972 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:28.972 "is_configured": false, 00:10:28.972 "data_offset": 2048, 00:10:28.972 "data_size": 63488 00:10:28.972 } 00:10:28.972 ] 00:10:28.972 }' 00:10:28.972 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:28.972 13:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.538 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:29.538 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:29.538 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:29.538 13:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:29.538 [2024-07-15 13:33:17.077081] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:29.538 [2024-07-15 13:33:17.077120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:29.538 [2024-07-15 13:33:17.077149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe70490 00:10:29.538 [2024-07-15 13:33:17.077157] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:29.538 [2024-07-15 13:33:17.077405] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:29.538 [2024-07-15 13:33:17.077417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:29.538 [2024-07-15 13:33:17.077465] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:29.538 [2024-07-15 13:33:17.077478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:29.538 [2024-07-15 13:33:17.077551] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe6f500 00:10:29.538 [2024-07-15 13:33:17.077558] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:29.538 [2024-07-15 13:33:17.077670] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101e4d0 00:10:29.538 [2024-07-15 13:33:17.077752] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe6f500 00:10:29.538 [2024-07-15 13:33:17.077759] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe6f500 00:10:29.538 [2024-07-15 13:33:17.077826] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:29.538 pt2 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.538 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:29.796 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
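(Re-registering the passthru bdevs lets the examine path find the raid superblock on each member and reassemble raid_bdev1 without any explicit bdev_raid_create: after pt1 alone the array sits in "configuring" with one of two members discovered, and adding pt2 brings it back "online", as the dump that follows shows. In outline, again as a sketch with shortened paths:)

    # Member 1 back: superblock found on pt1 -> raid_bdev1 reappears, state "configuring".
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 \
            -u 00000000-0000-0000-0000-000000000001

    # Member 2 back: superblock found on pt2 -> raid_bdev1 returns to "online".
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 \
            -u 00000000-0000-0000-0000-000000000002
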
00:10:29.796 "name": "raid_bdev1", 00:10:29.796 "uuid": "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:29.796 "strip_size_kb": 64, 00:10:29.796 "state": "online", 00:10:29.796 "raid_level": "concat", 00:10:29.796 "superblock": true, 00:10:29.796 "num_base_bdevs": 2, 00:10:29.796 "num_base_bdevs_discovered": 2, 00:10:29.796 "num_base_bdevs_operational": 2, 00:10:29.796 "base_bdevs_list": [ 00:10:29.796 { 00:10:29.796 "name": "pt1", 00:10:29.796 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:29.796 "is_configured": true, 00:10:29.796 "data_offset": 2048, 00:10:29.796 "data_size": 63488 00:10:29.796 }, 00:10:29.796 { 00:10:29.796 "name": "pt2", 00:10:29.796 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:29.796 "is_configured": true, 00:10:29.796 "data_offset": 2048, 00:10:29.796 "data_size": 63488 00:10:29.796 } 00:10:29.796 ] 00:10:29.796 }' 00:10:29.796 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:29.796 13:33:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:30.403 [2024-07-15 13:33:17.943478] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:30.403 "name": "raid_bdev1", 00:10:30.403 "aliases": [ 00:10:30.403 "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6" 00:10:30.403 ], 00:10:30.403 "product_name": "Raid Volume", 00:10:30.403 "block_size": 512, 00:10:30.403 "num_blocks": 126976, 00:10:30.403 "uuid": "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:30.403 "assigned_rate_limits": { 00:10:30.403 "rw_ios_per_sec": 0, 00:10:30.403 "rw_mbytes_per_sec": 0, 00:10:30.403 "r_mbytes_per_sec": 0, 00:10:30.403 "w_mbytes_per_sec": 0 00:10:30.403 }, 00:10:30.403 "claimed": false, 00:10:30.403 "zoned": false, 00:10:30.403 "supported_io_types": { 00:10:30.403 "read": true, 00:10:30.403 "write": true, 00:10:30.403 "unmap": true, 00:10:30.403 "flush": true, 00:10:30.403 "reset": true, 00:10:30.403 "nvme_admin": false, 00:10:30.403 "nvme_io": false, 00:10:30.403 "nvme_io_md": false, 00:10:30.403 "write_zeroes": true, 00:10:30.403 "zcopy": false, 00:10:30.403 "get_zone_info": false, 00:10:30.403 "zone_management": false, 00:10:30.403 "zone_append": false, 00:10:30.403 "compare": false, 00:10:30.403 "compare_and_write": false, 00:10:30.403 "abort": false, 00:10:30.403 "seek_hole": false, 00:10:30.403 "seek_data": false, 00:10:30.403 "copy": false, 00:10:30.403 "nvme_iov_md": false 00:10:30.403 }, 00:10:30.403 "memory_domains": [ 00:10:30.403 { 00:10:30.403 "dma_device_id": 
"system", 00:10:30.403 "dma_device_type": 1 00:10:30.403 }, 00:10:30.403 { 00:10:30.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:30.403 "dma_device_type": 2 00:10:30.403 }, 00:10:30.403 { 00:10:30.403 "dma_device_id": "system", 00:10:30.403 "dma_device_type": 1 00:10:30.403 }, 00:10:30.403 { 00:10:30.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:30.403 "dma_device_type": 2 00:10:30.403 } 00:10:30.403 ], 00:10:30.403 "driver_specific": { 00:10:30.403 "raid": { 00:10:30.403 "uuid": "8a83fb6a-f0f5-40e5-969d-2364d3df9ce6", 00:10:30.403 "strip_size_kb": 64, 00:10:30.403 "state": "online", 00:10:30.403 "raid_level": "concat", 00:10:30.403 "superblock": true, 00:10:30.403 "num_base_bdevs": 2, 00:10:30.403 "num_base_bdevs_discovered": 2, 00:10:30.403 "num_base_bdevs_operational": 2, 00:10:30.403 "base_bdevs_list": [ 00:10:30.403 { 00:10:30.403 "name": "pt1", 00:10:30.403 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:30.403 "is_configured": true, 00:10:30.403 "data_offset": 2048, 00:10:30.403 "data_size": 63488 00:10:30.403 }, 00:10:30.403 { 00:10:30.403 "name": "pt2", 00:10:30.403 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:30.403 "is_configured": true, 00:10:30.403 "data_offset": 2048, 00:10:30.403 "data_size": 63488 00:10:30.403 } 00:10:30.403 ] 00:10:30.403 } 00:10:30.403 } 00:10:30.403 }' 00:10:30.403 13:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:30.403 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:30.403 pt2' 00:10:30.403 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:30.403 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:30.403 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:30.662 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:30.662 "name": "pt1", 00:10:30.662 "aliases": [ 00:10:30.662 "00000000-0000-0000-0000-000000000001" 00:10:30.662 ], 00:10:30.662 "product_name": "passthru", 00:10:30.662 "block_size": 512, 00:10:30.662 "num_blocks": 65536, 00:10:30.662 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:30.662 "assigned_rate_limits": { 00:10:30.662 "rw_ios_per_sec": 0, 00:10:30.662 "rw_mbytes_per_sec": 0, 00:10:30.662 "r_mbytes_per_sec": 0, 00:10:30.662 "w_mbytes_per_sec": 0 00:10:30.662 }, 00:10:30.662 "claimed": true, 00:10:30.662 "claim_type": "exclusive_write", 00:10:30.662 "zoned": false, 00:10:30.662 "supported_io_types": { 00:10:30.662 "read": true, 00:10:30.662 "write": true, 00:10:30.662 "unmap": true, 00:10:30.662 "flush": true, 00:10:30.662 "reset": true, 00:10:30.662 "nvme_admin": false, 00:10:30.662 "nvme_io": false, 00:10:30.662 "nvme_io_md": false, 00:10:30.662 "write_zeroes": true, 00:10:30.662 "zcopy": true, 00:10:30.662 "get_zone_info": false, 00:10:30.662 "zone_management": false, 00:10:30.662 "zone_append": false, 00:10:30.662 "compare": false, 00:10:30.662 "compare_and_write": false, 00:10:30.662 "abort": true, 00:10:30.662 "seek_hole": false, 00:10:30.662 "seek_data": false, 00:10:30.662 "copy": true, 00:10:30.662 "nvme_iov_md": false 00:10:30.662 }, 00:10:30.662 "memory_domains": [ 00:10:30.662 { 00:10:30.662 "dma_device_id": "system", 00:10:30.662 "dma_device_type": 1 00:10:30.662 }, 
00:10:30.662 { 00:10:30.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:30.662 "dma_device_type": 2 00:10:30.662 } 00:10:30.662 ], 00:10:30.662 "driver_specific": { 00:10:30.662 "passthru": { 00:10:30.662 "name": "pt1", 00:10:30.662 "base_bdev_name": "malloc1" 00:10:30.662 } 00:10:30.662 } 00:10:30.662 }' 00:10:30.662 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:30.662 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:30.662 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:30.662 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:30.920 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:31.178 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:31.178 "name": "pt2", 00:10:31.178 "aliases": [ 00:10:31.178 "00000000-0000-0000-0000-000000000002" 00:10:31.178 ], 00:10:31.178 "product_name": "passthru", 00:10:31.178 "block_size": 512, 00:10:31.178 "num_blocks": 65536, 00:10:31.178 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:31.178 "assigned_rate_limits": { 00:10:31.178 "rw_ios_per_sec": 0, 00:10:31.178 "rw_mbytes_per_sec": 0, 00:10:31.178 "r_mbytes_per_sec": 0, 00:10:31.178 "w_mbytes_per_sec": 0 00:10:31.178 }, 00:10:31.178 "claimed": true, 00:10:31.178 "claim_type": "exclusive_write", 00:10:31.178 "zoned": false, 00:10:31.178 "supported_io_types": { 00:10:31.178 "read": true, 00:10:31.178 "write": true, 00:10:31.178 "unmap": true, 00:10:31.178 "flush": true, 00:10:31.178 "reset": true, 00:10:31.178 "nvme_admin": false, 00:10:31.178 "nvme_io": false, 00:10:31.178 "nvme_io_md": false, 00:10:31.178 "write_zeroes": true, 00:10:31.178 "zcopy": true, 00:10:31.178 "get_zone_info": false, 00:10:31.178 "zone_management": false, 00:10:31.178 "zone_append": false, 00:10:31.178 "compare": false, 00:10:31.178 "compare_and_write": false, 00:10:31.178 "abort": true, 00:10:31.178 "seek_hole": false, 00:10:31.178 "seek_data": false, 00:10:31.178 "copy": true, 00:10:31.178 "nvme_iov_md": false 00:10:31.178 }, 00:10:31.178 "memory_domains": [ 00:10:31.178 { 00:10:31.178 "dma_device_id": "system", 00:10:31.178 "dma_device_type": 1 00:10:31.178 }, 00:10:31.178 { 00:10:31.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.178 "dma_device_type": 2 00:10:31.178 } 00:10:31.178 ], 
00:10:31.178 "driver_specific": { 00:10:31.178 "passthru": { 00:10:31.178 "name": "pt2", 00:10:31.178 "base_bdev_name": "malloc2" 00:10:31.178 } 00:10:31.178 } 00:10:31.178 }' 00:10:31.178 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.178 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.178 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:31.178 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.178 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:31.437 13:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:31.695 [2024-07-15 13:33:19.106465] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8a83fb6a-f0f5-40e5-969d-2364d3df9ce6 '!=' 8a83fb6a-f0f5-40e5-969d-2364d3df9ce6 ']' 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4173258 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4173258 ']' 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4173258 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4173258 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4173258' 00:10:31.695 killing process with pid 4173258 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4173258 00:10:31.695 [2024-07-15 13:33:19.166576] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:31.695 [2024-07-15 
13:33:19.166620] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:31.695 [2024-07-15 13:33:19.166651] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:31.695 [2024-07-15 13:33:19.166660] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe6f500 name raid_bdev1, state offline 00:10:31.695 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4173258 00:10:31.695 [2024-07-15 13:33:19.185053] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:31.954 13:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:31.954 00:10:31.954 real 0m8.335s 00:10:31.954 user 0m14.697s 00:10:31.954 sys 0m1.672s 00:10:31.954 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:31.954 13:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.954 ************************************ 00:10:31.954 END TEST raid_superblock_test 00:10:31.954 ************************************ 00:10:31.954 13:33:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:31.954 13:33:19 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:10:31.954 13:33:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:31.954 13:33:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:31.954 13:33:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:31.954 ************************************ 00:10:31.954 START TEST raid_read_error_test 00:10:31.954 ************************************ 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:31.954 13:33:19 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zjlkaLgUFP 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4174566 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4174566 /var/tmp/spdk-raid.sock 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:31.954 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4174566 ']' 00:10:31.955 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:31.955 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:31.955 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:31.955 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:31.955 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:31.955 13:33:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.955 [2024-07-15 13:33:19.514972] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:10:31.955 [2024-07-15 13:33:19.515029] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4174566 ] 00:10:32.213 [2024-07-15 13:33:19.601654] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:32.213 [2024-07-15 13:33:19.692385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:32.213 [2024-07-15 13:33:19.754050] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:32.213 [2024-07-15 13:33:19.754082] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:32.779 13:33:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:32.779 13:33:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:32.779 13:33:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:32.779 13:33:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:33.038 BaseBdev1_malloc 00:10:33.038 13:33:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:33.297 true 00:10:33.297 13:33:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:33.297 [2024-07-15 13:33:20.848102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:33.297 [2024-07-15 13:33:20.848141] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:33.297 [2024-07-15 13:33:20.848157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184d990 00:10:33.297 [2024-07-15 13:33:20.848166] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:33.297 [2024-07-15 13:33:20.849553] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:33.297 [2024-07-15 13:33:20.849578] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:33.297 BaseBdev1 00:10:33.297 13:33:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:33.297 13:33:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:33.555 BaseBdev2_malloc 00:10:33.555 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:33.813 true 00:10:33.813 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:33.813 [2024-07-15 13:33:21.366361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:33.813 [2024-07-15 13:33:21.366396] vbdev_passthru.c: 
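(For the io-error test, each raid member traced above is a three-layer stack: a malloc bdev, an error bdev wrapped around it, and a passthru bdev on top that the raid actually consumes. The error layer is what bdev_error_inject_error later targets. A sketch of one member, using the names from the trace and abbreviated paths; BaseBdev2 is built the same way:)

    # Base data device, error-injection wrapper (registered as EE_BaseBdev1_malloc),
    # and the passthru member name the raid is built on.
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create \
            -b EE_BaseBdev1_malloc -p BaseBdev1
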
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:33.813 [2024-07-15 13:33:21.366427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18521d0 00:10:33.813 [2024-07-15 13:33:21.366435] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:33.813 [2024-07-15 13:33:21.367569] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:33.813 [2024-07-15 13:33:21.367591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:33.813 BaseBdev2 00:10:33.813 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:34.071 [2024-07-15 13:33:21.542847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:34.071 [2024-07-15 13:33:21.543815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:34.071 [2024-07-15 13:33:21.543952] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1853be0 00:10:34.071 [2024-07-15 13:33:21.543961] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:34.071 [2024-07-15 13:33:21.544109] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1854b50 00:10:34.071 [2024-07-15 13:33:21.544213] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1853be0 00:10:34.071 [2024-07-15 13:33:21.544219] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1853be0 00:10:34.071 [2024-07-15 13:33:21.544289] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:34.071 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:34.329 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:34.329 "name": "raid_bdev1", 00:10:34.330 "uuid": "2ec2e56d-45ca-4f28-9699-21fa076732eb", 00:10:34.330 "strip_size_kb": 64, 00:10:34.330 "state": "online", 00:10:34.330 "raid_level": 
"concat", 00:10:34.330 "superblock": true, 00:10:34.330 "num_base_bdevs": 2, 00:10:34.330 "num_base_bdevs_discovered": 2, 00:10:34.330 "num_base_bdevs_operational": 2, 00:10:34.330 "base_bdevs_list": [ 00:10:34.330 { 00:10:34.330 "name": "BaseBdev1", 00:10:34.330 "uuid": "a27803bc-b95a-5050-bd2f-969097cec245", 00:10:34.330 "is_configured": true, 00:10:34.330 "data_offset": 2048, 00:10:34.330 "data_size": 63488 00:10:34.330 }, 00:10:34.330 { 00:10:34.330 "name": "BaseBdev2", 00:10:34.330 "uuid": "bdc19f26-9728-50b7-aee8-9247839dcf93", 00:10:34.330 "is_configured": true, 00:10:34.330 "data_offset": 2048, 00:10:34.330 "data_size": 63488 00:10:34.330 } 00:10:34.330 ] 00:10:34.330 }' 00:10:34.330 13:33:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.330 13:33:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.895 13:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:34.895 13:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:34.895 [2024-07-15 13:33:22.333097] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x184f270 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.830 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:36.087 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:36.087 "name": "raid_bdev1", 00:10:36.087 "uuid": "2ec2e56d-45ca-4f28-9699-21fa076732eb", 00:10:36.087 "strip_size_kb": 64, 00:10:36.087 "state": "online", 
00:10:36.087 "raid_level": "concat", 00:10:36.087 "superblock": true, 00:10:36.087 "num_base_bdevs": 2, 00:10:36.087 "num_base_bdevs_discovered": 2, 00:10:36.087 "num_base_bdevs_operational": 2, 00:10:36.087 "base_bdevs_list": [ 00:10:36.087 { 00:10:36.087 "name": "BaseBdev1", 00:10:36.087 "uuid": "a27803bc-b95a-5050-bd2f-969097cec245", 00:10:36.087 "is_configured": true, 00:10:36.087 "data_offset": 2048, 00:10:36.087 "data_size": 63488 00:10:36.087 }, 00:10:36.087 { 00:10:36.087 "name": "BaseBdev2", 00:10:36.087 "uuid": "bdc19f26-9728-50b7-aee8-9247839dcf93", 00:10:36.087 "is_configured": true, 00:10:36.087 "data_offset": 2048, 00:10:36.087 "data_size": 63488 00:10:36.087 } 00:10:36.087 ] 00:10:36.087 }' 00:10:36.087 13:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:36.087 13:33:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.653 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:36.911 [2024-07-15 13:33:24.285063] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:36.911 [2024-07-15 13:33:24.285093] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:36.911 [2024-07-15 13:33:24.287083] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.911 [2024-07-15 13:33:24.287104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:36.911 [2024-07-15 13:33:24.287128] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:36.911 [2024-07-15 13:33:24.287135] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1853be0 name raid_bdev1, state offline 00:10:36.911 0 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4174566 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4174566 ']' 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4174566 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4174566 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4174566' 00:10:36.911 killing process with pid 4174566 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4174566 00:10:36.911 [2024-07-15 13:33:24.343804] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:36.911 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4174566 00:10:36.911 [2024-07-15 13:33:24.353572] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:37.168 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zjlkaLgUFP 00:10:37.168 13:33:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:37.168 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:10:37.169 00:10:37.169 real 0m5.105s 00:10:37.169 user 0m7.687s 00:10:37.169 sys 0m0.915s 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.169 13:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.169 ************************************ 00:10:37.169 END TEST raid_read_error_test 00:10:37.169 ************************************ 00:10:37.169 13:33:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:37.169 13:33:24 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:10:37.169 13:33:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:37.169 13:33:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.169 13:33:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:37.169 ************************************ 00:10:37.169 START TEST raid_write_error_test 00:10:37.169 ************************************ 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:37.169 13:33:24 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CYy9VsuRT6 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4175347 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4175347 /var/tmp/spdk-raid.sock 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4175347 ']' 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:37.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:37.169 13:33:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.169 [2024-07-15 13:33:24.714350] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
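For context while reading the write-error run that starts here: the test drives a fixed RPC sequence against the bdevperf app launched above. A minimal sketch of that sequence, assuming the same socket path and bdev parameters that appear elsewhere in this log, is:

    # sketch only -- the RPC calls raid_io_error_test issues for the write-failure case
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc             # 32 MB malloc disk, 512-byte blocks
    $RPC bdev_error_create BaseBdev1_malloc                        # error-injection wrapper (creates EE_BaseBdev1_malloc)
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1  # passthru bdev exposed to the raid
    # BaseBdev2 is built the same way, then the concat raid is assembled:
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure # fail writes on the first base bdev

The fail_per_s value graded at the end of the test is pulled from the bdevperf job log with the grep/awk pipeline shown further down.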
00:10:37.169 [2024-07-15 13:33:24.714409] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4175347 ] 00:10:37.427 [2024-07-15 13:33:24.803139] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:37.427 [2024-07-15 13:33:24.889804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.427 [2024-07-15 13:33:24.947171] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.427 [2024-07-15 13:33:24.947202] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.993 13:33:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:37.993 13:33:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:37.993 13:33:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:37.993 13:33:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:38.251 BaseBdev1_malloc 00:10:38.251 13:33:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:38.251 true 00:10:38.251 13:33:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:38.509 [2024-07-15 13:33:26.013763] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:38.509 [2024-07-15 13:33:26.013802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:38.509 [2024-07-15 13:33:26.013815] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d84990 00:10:38.509 [2024-07-15 13:33:26.013823] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:38.509 [2024-07-15 13:33:26.015021] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:38.509 [2024-07-15 13:33:26.015041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:38.509 BaseBdev1 00:10:38.509 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:38.509 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:38.768 BaseBdev2_malloc 00:10:38.768 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:38.768 true 00:10:39.026 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:39.026 [2024-07-15 13:33:26.550774] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:39.026 [2024-07-15 13:33:26.550810] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:39.026 [2024-07-15 13:33:26.550840] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d891d0 00:10:39.026 [2024-07-15 13:33:26.550850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:39.026 [2024-07-15 13:33:26.551866] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:39.026 [2024-07-15 13:33:26.551887] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:39.026 BaseBdev2 00:10:39.026 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:39.285 [2024-07-15 13:33:26.723248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:39.285 [2024-07-15 13:33:26.724146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:39.285 [2024-07-15 13:33:26.724281] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d8abe0 00:10:39.285 [2024-07-15 13:33:26.724290] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:39.285 [2024-07-15 13:33:26.724423] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d8bb50 00:10:39.285 [2024-07-15 13:33:26.724523] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d8abe0 00:10:39.285 [2024-07-15 13:33:26.724529] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d8abe0 00:10:39.285 [2024-07-15 13:33:26.724597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.285 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:39.544 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:39.544 "name": "raid_bdev1", 00:10:39.544 "uuid": "635438e5-51db-4521-bf5d-79e712a9e06f", 00:10:39.544 "strip_size_kb": 64, 00:10:39.544 "state": "online", 00:10:39.544 
"raid_level": "concat", 00:10:39.544 "superblock": true, 00:10:39.544 "num_base_bdevs": 2, 00:10:39.544 "num_base_bdevs_discovered": 2, 00:10:39.544 "num_base_bdevs_operational": 2, 00:10:39.544 "base_bdevs_list": [ 00:10:39.544 { 00:10:39.544 "name": "BaseBdev1", 00:10:39.544 "uuid": "9f5e7c88-522e-5050-bcff-e70726a363ca", 00:10:39.544 "is_configured": true, 00:10:39.544 "data_offset": 2048, 00:10:39.544 "data_size": 63488 00:10:39.544 }, 00:10:39.544 { 00:10:39.544 "name": "BaseBdev2", 00:10:39.544 "uuid": "080ebe61-47db-5ffe-a22d-c0299a392da1", 00:10:39.544 "is_configured": true, 00:10:39.544 "data_offset": 2048, 00:10:39.544 "data_size": 63488 00:10:39.544 } 00:10:39.544 ] 00:10:39.544 }' 00:10:39.544 13:33:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:39.544 13:33:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.802 13:33:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:39.802 13:33:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:40.140 [2024-07-15 13:33:27.489470] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d86270 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.096 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:41.353 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.353 "name": "raid_bdev1", 00:10:41.353 "uuid": "635438e5-51db-4521-bf5d-79e712a9e06f", 00:10:41.353 "strip_size_kb": 
64, 00:10:41.353 "state": "online", 00:10:41.353 "raid_level": "concat", 00:10:41.353 "superblock": true, 00:10:41.353 "num_base_bdevs": 2, 00:10:41.353 "num_base_bdevs_discovered": 2, 00:10:41.354 "num_base_bdevs_operational": 2, 00:10:41.354 "base_bdevs_list": [ 00:10:41.354 { 00:10:41.354 "name": "BaseBdev1", 00:10:41.354 "uuid": "9f5e7c88-522e-5050-bcff-e70726a363ca", 00:10:41.354 "is_configured": true, 00:10:41.354 "data_offset": 2048, 00:10:41.354 "data_size": 63488 00:10:41.354 }, 00:10:41.354 { 00:10:41.354 "name": "BaseBdev2", 00:10:41.354 "uuid": "080ebe61-47db-5ffe-a22d-c0299a392da1", 00:10:41.354 "is_configured": true, 00:10:41.354 "data_offset": 2048, 00:10:41.354 "data_size": 63488 00:10:41.354 } 00:10:41.354 ] 00:10:41.354 }' 00:10:41.354 13:33:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.354 13:33:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:41.919 [2024-07-15 13:33:29.422167] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:41.919 [2024-07-15 13:33:29.422194] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:41.919 [2024-07-15 13:33:29.424368] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:41.919 [2024-07-15 13:33:29.424391] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:41.919 [2024-07-15 13:33:29.424411] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:41.919 [2024-07-15 13:33:29.424419] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d8abe0 name raid_bdev1, state offline 00:10:41.919 0 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4175347 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4175347 ']' 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4175347 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4175347 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4175347' 00:10:41.919 killing process with pid 4175347 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4175347 00:10:41.919 [2024-07-15 13:33:29.489810] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:41.919 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4175347 00:10:41.919 [2024-07-15 13:33:29.499972] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.CYy9VsuRT6 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:10:42.177 00:10:42.177 real 0m5.075s 00:10:42.177 user 0m7.620s 00:10:42.177 sys 0m0.901s 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:42.177 13:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:42.177 ************************************ 00:10:42.177 END TEST raid_write_error_test 00:10:42.177 ************************************ 00:10:42.177 13:33:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:42.177 13:33:29 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:42.177 13:33:29 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:10:42.177 13:33:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:42.177 13:33:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:42.177 13:33:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:42.436 ************************************ 00:10:42.436 START TEST raid_state_function_test 00:10:42.436 ************************************ 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:42.436 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:42.437 13:33:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4176149 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4176149' 00:10:42.437 Process raid pid: 4176149 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4176149 /var/tmp/spdk-raid.sock 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4176149 ']' 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:42.437 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:42.437 13:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:42.437 [2024-07-15 13:33:29.869125] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
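A note on the verification pattern that repeats through the state-function run below: verify_raid_bdev_state reads the raid bdev back over RPC and compares selected fields against the expected values passed in. A rough sketch of that check, using only the field names visible in the JSON dumps in this log (not the helper's actual body), looks like:

    # sketch only -- approximates what verify_raid_bdev_state checks
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    state=$(echo "$info" | jq -r '.state')                          # "configuring" until all base bdevs exist, then "online"
    discovered=$(echo "$info" | jq -r '.num_base_bdevs_discovered') # how many base bdevs are currently attached
    operational=$(echo "$info" | jq -r '.num_base_bdevs_operational')
    [ "$state" = "configuring" ] && [ "$discovered" -eq 0 ] && [ "$operational" -eq 2 ]

In this first pass neither BaseBdev1 nor BaseBdev2 exists yet, so the expected result is a configuring raid with zero discovered base bdevs, which is what the next bdev_raid_get_bdevs dump reports.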
00:10:42.437 [2024-07-15 13:33:29.869183] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:42.437 [2024-07-15 13:33:29.957258] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:42.437 [2024-07-15 13:33:30.054084] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:42.694 [2024-07-15 13:33:30.115015] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:42.694 [2024-07-15 13:33:30.115046] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:43.259 [2024-07-15 13:33:30.805317] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:43.259 [2024-07-15 13:33:30.805351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:43.259 [2024-07-15 13:33:30.805359] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:43.259 [2024-07-15 13:33:30.805367] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.259 13:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:43.515 13:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:43.515 "name": "Existed_Raid", 00:10:43.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.515 "strip_size_kb": 0, 00:10:43.515 "state": "configuring", 00:10:43.515 "raid_level": "raid1", 00:10:43.515 "superblock": false, 00:10:43.515 
"num_base_bdevs": 2, 00:10:43.515 "num_base_bdevs_discovered": 0, 00:10:43.515 "num_base_bdevs_operational": 2, 00:10:43.515 "base_bdevs_list": [ 00:10:43.515 { 00:10:43.515 "name": "BaseBdev1", 00:10:43.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.515 "is_configured": false, 00:10:43.515 "data_offset": 0, 00:10:43.515 "data_size": 0 00:10:43.515 }, 00:10:43.515 { 00:10:43.515 "name": "BaseBdev2", 00:10:43.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.515 "is_configured": false, 00:10:43.515 "data_offset": 0, 00:10:43.515 "data_size": 0 00:10:43.515 } 00:10:43.515 ] 00:10:43.515 }' 00:10:43.515 13:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:43.515 13:33:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.079 13:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:44.079 [2024-07-15 13:33:31.651441] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:44.079 [2024-07-15 13:33:31.651475] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x256cf30 name Existed_Raid, state configuring 00:10:44.079 13:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:44.336 [2024-07-15 13:33:31.827895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:44.336 [2024-07-15 13:33:31.827930] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:44.336 [2024-07-15 13:33:31.827937] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:44.336 [2024-07-15 13:33:31.827944] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:44.336 13:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:44.598 [2024-07-15 13:33:32.014223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:44.598 BaseBdev1 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:44.598 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:44.855 [ 
00:10:44.855 { 00:10:44.855 "name": "BaseBdev1", 00:10:44.855 "aliases": [ 00:10:44.855 "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27" 00:10:44.855 ], 00:10:44.855 "product_name": "Malloc disk", 00:10:44.855 "block_size": 512, 00:10:44.855 "num_blocks": 65536, 00:10:44.855 "uuid": "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27", 00:10:44.855 "assigned_rate_limits": { 00:10:44.855 "rw_ios_per_sec": 0, 00:10:44.855 "rw_mbytes_per_sec": 0, 00:10:44.855 "r_mbytes_per_sec": 0, 00:10:44.855 "w_mbytes_per_sec": 0 00:10:44.855 }, 00:10:44.855 "claimed": true, 00:10:44.855 "claim_type": "exclusive_write", 00:10:44.855 "zoned": false, 00:10:44.855 "supported_io_types": { 00:10:44.855 "read": true, 00:10:44.855 "write": true, 00:10:44.855 "unmap": true, 00:10:44.855 "flush": true, 00:10:44.855 "reset": true, 00:10:44.855 "nvme_admin": false, 00:10:44.855 "nvme_io": false, 00:10:44.855 "nvme_io_md": false, 00:10:44.855 "write_zeroes": true, 00:10:44.855 "zcopy": true, 00:10:44.855 "get_zone_info": false, 00:10:44.855 "zone_management": false, 00:10:44.855 "zone_append": false, 00:10:44.855 "compare": false, 00:10:44.855 "compare_and_write": false, 00:10:44.855 "abort": true, 00:10:44.855 "seek_hole": false, 00:10:44.855 "seek_data": false, 00:10:44.855 "copy": true, 00:10:44.855 "nvme_iov_md": false 00:10:44.855 }, 00:10:44.855 "memory_domains": [ 00:10:44.855 { 00:10:44.855 "dma_device_id": "system", 00:10:44.855 "dma_device_type": 1 00:10:44.855 }, 00:10:44.855 { 00:10:44.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.855 "dma_device_type": 2 00:10:44.855 } 00:10:44.855 ], 00:10:44.855 "driver_specific": {} 00:10:44.855 } 00:10:44.855 ] 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.855 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:45.112 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.112 "name": "Existed_Raid", 00:10:45.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.112 "strip_size_kb": 0, 00:10:45.112 "state": "configuring", 00:10:45.112 "raid_level": "raid1", 
00:10:45.112 "superblock": false, 00:10:45.112 "num_base_bdevs": 2, 00:10:45.112 "num_base_bdevs_discovered": 1, 00:10:45.112 "num_base_bdevs_operational": 2, 00:10:45.112 "base_bdevs_list": [ 00:10:45.112 { 00:10:45.112 "name": "BaseBdev1", 00:10:45.112 "uuid": "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27", 00:10:45.112 "is_configured": true, 00:10:45.112 "data_offset": 0, 00:10:45.112 "data_size": 65536 00:10:45.112 }, 00:10:45.112 { 00:10:45.112 "name": "BaseBdev2", 00:10:45.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.112 "is_configured": false, 00:10:45.112 "data_offset": 0, 00:10:45.112 "data_size": 0 00:10:45.112 } 00:10:45.112 ] 00:10:45.112 }' 00:10:45.112 13:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.112 13:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:45.677 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:45.677 [2024-07-15 13:33:33.233389] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:45.677 [2024-07-15 13:33:33.233432] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x256c820 name Existed_Raid, state configuring 00:10:45.677 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:45.936 [2024-07-15 13:33:33.401843] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:45.936 [2024-07-15 13:33:33.402964] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:45.936 [2024-07-15 13:33:33.402993] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:10:45.936 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:46.195 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:46.195 "name": "Existed_Raid", 00:10:46.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.195 "strip_size_kb": 0, 00:10:46.195 "state": "configuring", 00:10:46.195 "raid_level": "raid1", 00:10:46.195 "superblock": false, 00:10:46.195 "num_base_bdevs": 2, 00:10:46.195 "num_base_bdevs_discovered": 1, 00:10:46.195 "num_base_bdevs_operational": 2, 00:10:46.195 "base_bdevs_list": [ 00:10:46.195 { 00:10:46.195 "name": "BaseBdev1", 00:10:46.195 "uuid": "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27", 00:10:46.195 "is_configured": true, 00:10:46.195 "data_offset": 0, 00:10:46.195 "data_size": 65536 00:10:46.195 }, 00:10:46.195 { 00:10:46.195 "name": "BaseBdev2", 00:10:46.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.195 "is_configured": false, 00:10:46.195 "data_offset": 0, 00:10:46.195 "data_size": 0 00:10:46.195 } 00:10:46.195 ] 00:10:46.195 }' 00:10:46.195 13:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:46.195 13:33:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.453 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:46.712 [2024-07-15 13:33:34.230900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:46.712 [2024-07-15 13:33:34.230933] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x256d610 00:10:46.712 [2024-07-15 13:33:34.230940] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:10:46.712 [2024-07-15 13:33:34.231101] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27110d0 00:10:46.712 [2024-07-15 13:33:34.231193] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x256d610 00:10:46.712 [2024-07-15 13:33:34.231200] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x256d610 00:10:46.712 [2024-07-15 13:33:34.231329] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:46.712 BaseBdev2 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:46.712 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:46.970 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:47.227 [ 
00:10:47.227 { 00:10:47.227 "name": "BaseBdev2", 00:10:47.227 "aliases": [ 00:10:47.227 "69de9c8d-93eb-4ec0-adeb-302586100886" 00:10:47.228 ], 00:10:47.228 "product_name": "Malloc disk", 00:10:47.228 "block_size": 512, 00:10:47.228 "num_blocks": 65536, 00:10:47.228 "uuid": "69de9c8d-93eb-4ec0-adeb-302586100886", 00:10:47.228 "assigned_rate_limits": { 00:10:47.228 "rw_ios_per_sec": 0, 00:10:47.228 "rw_mbytes_per_sec": 0, 00:10:47.228 "r_mbytes_per_sec": 0, 00:10:47.228 "w_mbytes_per_sec": 0 00:10:47.228 }, 00:10:47.228 "claimed": true, 00:10:47.228 "claim_type": "exclusive_write", 00:10:47.228 "zoned": false, 00:10:47.228 "supported_io_types": { 00:10:47.228 "read": true, 00:10:47.228 "write": true, 00:10:47.228 "unmap": true, 00:10:47.228 "flush": true, 00:10:47.228 "reset": true, 00:10:47.228 "nvme_admin": false, 00:10:47.228 "nvme_io": false, 00:10:47.228 "nvme_io_md": false, 00:10:47.228 "write_zeroes": true, 00:10:47.228 "zcopy": true, 00:10:47.228 "get_zone_info": false, 00:10:47.228 "zone_management": false, 00:10:47.228 "zone_append": false, 00:10:47.228 "compare": false, 00:10:47.228 "compare_and_write": false, 00:10:47.228 "abort": true, 00:10:47.228 "seek_hole": false, 00:10:47.228 "seek_data": false, 00:10:47.228 "copy": true, 00:10:47.228 "nvme_iov_md": false 00:10:47.228 }, 00:10:47.228 "memory_domains": [ 00:10:47.228 { 00:10:47.228 "dma_device_id": "system", 00:10:47.228 "dma_device_type": 1 00:10:47.228 }, 00:10:47.228 { 00:10:47.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.228 "dma_device_type": 2 00:10:47.228 } 00:10:47.228 ], 00:10:47.228 "driver_specific": {} 00:10:47.228 } 00:10:47.228 ] 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:47.228 "name": "Existed_Raid", 00:10:47.228 "uuid": "170d76cf-4539-42d8-833d-583a0f056523", 00:10:47.228 "strip_size_kb": 0, 00:10:47.228 "state": "online", 00:10:47.228 "raid_level": "raid1", 00:10:47.228 "superblock": false, 00:10:47.228 "num_base_bdevs": 2, 00:10:47.228 "num_base_bdevs_discovered": 2, 00:10:47.228 "num_base_bdevs_operational": 2, 00:10:47.228 "base_bdevs_list": [ 00:10:47.228 { 00:10:47.228 "name": "BaseBdev1", 00:10:47.228 "uuid": "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27", 00:10:47.228 "is_configured": true, 00:10:47.228 "data_offset": 0, 00:10:47.228 "data_size": 65536 00:10:47.228 }, 00:10:47.228 { 00:10:47.228 "name": "BaseBdev2", 00:10:47.228 "uuid": "69de9c8d-93eb-4ec0-adeb-302586100886", 00:10:47.228 "is_configured": true, 00:10:47.228 "data_offset": 0, 00:10:47.228 "data_size": 65536 00:10:47.228 } 00:10:47.228 ] 00:10:47.228 }' 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.228 13:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:47.795 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:48.053 [2024-07-15 13:33:35.422313] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:48.053 "name": "Existed_Raid", 00:10:48.053 "aliases": [ 00:10:48.053 "170d76cf-4539-42d8-833d-583a0f056523" 00:10:48.053 ], 00:10:48.053 "product_name": "Raid Volume", 00:10:48.053 "block_size": 512, 00:10:48.053 "num_blocks": 65536, 00:10:48.053 "uuid": "170d76cf-4539-42d8-833d-583a0f056523", 00:10:48.053 "assigned_rate_limits": { 00:10:48.053 "rw_ios_per_sec": 0, 00:10:48.053 "rw_mbytes_per_sec": 0, 00:10:48.053 "r_mbytes_per_sec": 0, 00:10:48.053 "w_mbytes_per_sec": 0 00:10:48.053 }, 00:10:48.053 "claimed": false, 00:10:48.053 "zoned": false, 00:10:48.053 "supported_io_types": { 00:10:48.053 "read": true, 00:10:48.053 "write": true, 00:10:48.053 "unmap": false, 00:10:48.053 "flush": false, 00:10:48.053 "reset": true, 00:10:48.053 "nvme_admin": false, 00:10:48.053 "nvme_io": false, 00:10:48.053 "nvme_io_md": false, 00:10:48.053 "write_zeroes": true, 00:10:48.053 "zcopy": false, 00:10:48.053 "get_zone_info": false, 00:10:48.053 "zone_management": false, 00:10:48.053 "zone_append": false, 00:10:48.053 "compare": false, 00:10:48.053 "compare_and_write": false, 00:10:48.053 "abort": false, 00:10:48.053 "seek_hole": false, 00:10:48.053 "seek_data": false, 00:10:48.053 "copy": false, 00:10:48.053 "nvme_iov_md": false 00:10:48.053 }, 00:10:48.053 
"memory_domains": [ 00:10:48.053 { 00:10:48.053 "dma_device_id": "system", 00:10:48.053 "dma_device_type": 1 00:10:48.053 }, 00:10:48.053 { 00:10:48.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.053 "dma_device_type": 2 00:10:48.053 }, 00:10:48.053 { 00:10:48.053 "dma_device_id": "system", 00:10:48.053 "dma_device_type": 1 00:10:48.053 }, 00:10:48.053 { 00:10:48.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.053 "dma_device_type": 2 00:10:48.053 } 00:10:48.053 ], 00:10:48.053 "driver_specific": { 00:10:48.053 "raid": { 00:10:48.053 "uuid": "170d76cf-4539-42d8-833d-583a0f056523", 00:10:48.053 "strip_size_kb": 0, 00:10:48.053 "state": "online", 00:10:48.053 "raid_level": "raid1", 00:10:48.053 "superblock": false, 00:10:48.053 "num_base_bdevs": 2, 00:10:48.053 "num_base_bdevs_discovered": 2, 00:10:48.053 "num_base_bdevs_operational": 2, 00:10:48.053 "base_bdevs_list": [ 00:10:48.053 { 00:10:48.053 "name": "BaseBdev1", 00:10:48.053 "uuid": "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27", 00:10:48.053 "is_configured": true, 00:10:48.053 "data_offset": 0, 00:10:48.053 "data_size": 65536 00:10:48.053 }, 00:10:48.053 { 00:10:48.053 "name": "BaseBdev2", 00:10:48.053 "uuid": "69de9c8d-93eb-4ec0-adeb-302586100886", 00:10:48.053 "is_configured": true, 00:10:48.053 "data_offset": 0, 00:10:48.053 "data_size": 65536 00:10:48.053 } 00:10:48.053 ] 00:10:48.053 } 00:10:48.053 } 00:10:48.053 }' 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:48.053 BaseBdev2' 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:48.053 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:48.053 "name": "BaseBdev1", 00:10:48.053 "aliases": [ 00:10:48.053 "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27" 00:10:48.053 ], 00:10:48.053 "product_name": "Malloc disk", 00:10:48.053 "block_size": 512, 00:10:48.053 "num_blocks": 65536, 00:10:48.053 "uuid": "a410b7bb-a475-4cab-9d30-2f1e3fcd2a27", 00:10:48.053 "assigned_rate_limits": { 00:10:48.053 "rw_ios_per_sec": 0, 00:10:48.053 "rw_mbytes_per_sec": 0, 00:10:48.053 "r_mbytes_per_sec": 0, 00:10:48.053 "w_mbytes_per_sec": 0 00:10:48.053 }, 00:10:48.053 "claimed": true, 00:10:48.053 "claim_type": "exclusive_write", 00:10:48.053 "zoned": false, 00:10:48.053 "supported_io_types": { 00:10:48.053 "read": true, 00:10:48.053 "write": true, 00:10:48.053 "unmap": true, 00:10:48.053 "flush": true, 00:10:48.053 "reset": true, 00:10:48.053 "nvme_admin": false, 00:10:48.053 "nvme_io": false, 00:10:48.053 "nvme_io_md": false, 00:10:48.053 "write_zeroes": true, 00:10:48.053 "zcopy": true, 00:10:48.053 "get_zone_info": false, 00:10:48.053 "zone_management": false, 00:10:48.053 "zone_append": false, 00:10:48.053 "compare": false, 00:10:48.053 "compare_and_write": false, 00:10:48.053 "abort": true, 00:10:48.053 "seek_hole": false, 00:10:48.053 "seek_data": false, 00:10:48.053 "copy": true, 00:10:48.053 "nvme_iov_md": false 00:10:48.053 }, 00:10:48.053 
"memory_domains": [ 00:10:48.053 { 00:10:48.053 "dma_device_id": "system", 00:10:48.054 "dma_device_type": 1 00:10:48.054 }, 00:10:48.054 { 00:10:48.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.054 "dma_device_type": 2 00:10:48.054 } 00:10:48.054 ], 00:10:48.054 "driver_specific": {} 00:10:48.054 }' 00:10:48.054 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:48.311 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:48.569 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:48.569 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:48.569 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:48.569 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:48.569 13:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:48.569 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:48.570 "name": "BaseBdev2", 00:10:48.570 "aliases": [ 00:10:48.570 "69de9c8d-93eb-4ec0-adeb-302586100886" 00:10:48.570 ], 00:10:48.570 "product_name": "Malloc disk", 00:10:48.570 "block_size": 512, 00:10:48.570 "num_blocks": 65536, 00:10:48.570 "uuid": "69de9c8d-93eb-4ec0-adeb-302586100886", 00:10:48.570 "assigned_rate_limits": { 00:10:48.570 "rw_ios_per_sec": 0, 00:10:48.570 "rw_mbytes_per_sec": 0, 00:10:48.570 "r_mbytes_per_sec": 0, 00:10:48.570 "w_mbytes_per_sec": 0 00:10:48.570 }, 00:10:48.570 "claimed": true, 00:10:48.570 "claim_type": "exclusive_write", 00:10:48.570 "zoned": false, 00:10:48.570 "supported_io_types": { 00:10:48.570 "read": true, 00:10:48.570 "write": true, 00:10:48.570 "unmap": true, 00:10:48.570 "flush": true, 00:10:48.570 "reset": true, 00:10:48.570 "nvme_admin": false, 00:10:48.570 "nvme_io": false, 00:10:48.570 "nvme_io_md": false, 00:10:48.570 "write_zeroes": true, 00:10:48.570 "zcopy": true, 00:10:48.570 "get_zone_info": false, 00:10:48.570 "zone_management": false, 00:10:48.570 "zone_append": false, 00:10:48.570 "compare": false, 00:10:48.570 "compare_and_write": false, 00:10:48.570 "abort": true, 00:10:48.570 "seek_hole": false, 00:10:48.570 "seek_data": false, 00:10:48.570 "copy": true, 00:10:48.570 "nvme_iov_md": false 00:10:48.570 }, 00:10:48.570 "memory_domains": [ 00:10:48.570 { 00:10:48.570 "dma_device_id": "system", 00:10:48.570 "dma_device_type": 1 00:10:48.570 }, 00:10:48.570 { 00:10:48.570 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:48.570 "dma_device_type": 2 00:10:48.570 } 00:10:48.570 ], 00:10:48.570 "driver_specific": {} 00:10:48.570 }' 00:10:48.570 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:48.828 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:49.086 [2024-07-15 13:33:36.621254] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:10:49.086 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:49.344 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.344 "name": "Existed_Raid", 00:10:49.344 "uuid": "170d76cf-4539-42d8-833d-583a0f056523", 00:10:49.344 "strip_size_kb": 0, 00:10:49.344 "state": "online", 00:10:49.344 "raid_level": "raid1", 00:10:49.344 "superblock": false, 00:10:49.344 "num_base_bdevs": 2, 00:10:49.344 "num_base_bdevs_discovered": 1, 00:10:49.344 "num_base_bdevs_operational": 1, 00:10:49.344 "base_bdevs_list": [ 00:10:49.344 { 00:10:49.345 "name": null, 00:10:49.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:49.345 "is_configured": false, 00:10:49.345 "data_offset": 0, 00:10:49.345 "data_size": 65536 00:10:49.345 }, 00:10:49.345 { 00:10:49.345 "name": "BaseBdev2", 00:10:49.345 "uuid": "69de9c8d-93eb-4ec0-adeb-302586100886", 00:10:49.345 "is_configured": true, 00:10:49.345 "data_offset": 0, 00:10:49.345 "data_size": 65536 00:10:49.345 } 00:10:49.345 ] 00:10:49.345 }' 00:10:49.345 13:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.345 13:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:49.909 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:50.166 [2024-07-15 13:33:37.656718] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:50.166 [2024-07-15 13:33:37.656784] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:50.166 [2024-07-15 13:33:37.668740] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:50.166 [2024-07-15 13:33:37.668784] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:50.166 [2024-07-15 13:33:37.668792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x256d610 name Existed_Raid, state offline 00:10:50.166 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:50.166 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:50.166 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.166 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4176149 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4176149 ']' 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4176149 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4176149 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4176149' 00:10:50.423 killing process with pid 4176149 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4176149 00:10:50.423 [2024-07-15 13:33:37.912250] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:50.423 13:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4176149 00:10:50.423 [2024-07-15 13:33:37.913051] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:50.681 00:10:50.681 real 0m8.293s 00:10:50.681 user 0m14.595s 00:10:50.681 sys 0m1.614s 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:50.681 ************************************ 00:10:50.681 END TEST raid_state_function_test 00:10:50.681 ************************************ 00:10:50.681 13:33:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:50.681 13:33:38 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:10:50.681 13:33:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:50.681 13:33:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:50.681 13:33:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:50.681 ************************************ 00:10:50.681 START TEST raid_state_function_test_sb 00:10:50.681 ************************************ 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:50.681 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4177445 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4177445' 00:10:50.682 Process raid pid: 4177445 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4177445 /var/tmp/spdk-raid.sock 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4177445 ']' 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:50.682 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
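
Before any raid RPCs are issued, the _sb variant above simply launches a standalone bdev_svc application on a private RPC socket and waits for it to come up. A minimal sketch of that setup step, assuming the same SPDK workspace path and socket as in the trace, is below; the polling loop is only an illustrative stand-in for the harness's waitforlisten helper, not its exact code.

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

  # Launch the bdev service app on a dedicated RPC socket with raid debug logging,
  # mirroring the command line visible in the trace above.
  $SPDK/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!

  # Poll until the socket answers RPCs (rpc_get_methods is a standard SPDK RPC).
  until $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
  done
  echo "bdev_svc (pid $raid_pid) listening on /var/tmp/spdk-raid.sock"
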
00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:50.682 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:50.682 [2024-07-15 13:33:38.225656] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:10:50.682 [2024-07-15 13:33:38.225707] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:50.939 [2024-07-15 13:33:38.314967] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:50.939 [2024-07-15 13:33:38.406285] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:50.939 [2024-07-15 13:33:38.468515] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:50.939 [2024-07-15 13:33:38.468541] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:51.502 13:33:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:51.502 13:33:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:51.502 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:51.759 [2024-07-15 13:33:39.188824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:51.759 [2024-07-15 13:33:39.188857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:51.759 [2024-07-15 13:33:39.188864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:51.759 [2024-07-15 13:33:39.188887] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.759 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:10:52.016 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:52.016 "name": "Existed_Raid", 00:10:52.016 "uuid": "bf7691a4-eb6f-4c65-a442-c9eea42614df", 00:10:52.016 "strip_size_kb": 0, 00:10:52.016 "state": "configuring", 00:10:52.016 "raid_level": "raid1", 00:10:52.016 "superblock": true, 00:10:52.016 "num_base_bdevs": 2, 00:10:52.016 "num_base_bdevs_discovered": 0, 00:10:52.016 "num_base_bdevs_operational": 2, 00:10:52.016 "base_bdevs_list": [ 00:10:52.016 { 00:10:52.016 "name": "BaseBdev1", 00:10:52.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:52.016 "is_configured": false, 00:10:52.016 "data_offset": 0, 00:10:52.016 "data_size": 0 00:10:52.016 }, 00:10:52.016 { 00:10:52.016 "name": "BaseBdev2", 00:10:52.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:52.016 "is_configured": false, 00:10:52.016 "data_offset": 0, 00:10:52.016 "data_size": 0 00:10:52.016 } 00:10:52.016 ] 00:10:52.016 }' 00:10:52.016 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:52.016 13:33:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:52.272 13:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:52.545 [2024-07-15 13:33:40.010845] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:52.545 [2024-07-15 13:33:40.010873] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x199cf30 name Existed_Raid, state configuring 00:10:52.545 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:52.802 [2024-07-15 13:33:40.191342] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:52.802 [2024-07-15 13:33:40.191375] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:52.802 [2024-07-15 13:33:40.191382] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:52.802 [2024-07-15 13:33:40.191390] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:52.802 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:52.802 [2024-07-15 13:33:40.373741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:52.802 BaseBdev1 00:10:52.802 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:52.802 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:52.802 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:52.802 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:52.803 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:52.803 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:52.803 13:33:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:53.059 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:53.316 [ 00:10:53.316 { 00:10:53.316 "name": "BaseBdev1", 00:10:53.316 "aliases": [ 00:10:53.316 "fc04b10e-cd13-406a-b926-9369d368a415" 00:10:53.316 ], 00:10:53.316 "product_name": "Malloc disk", 00:10:53.316 "block_size": 512, 00:10:53.316 "num_blocks": 65536, 00:10:53.316 "uuid": "fc04b10e-cd13-406a-b926-9369d368a415", 00:10:53.316 "assigned_rate_limits": { 00:10:53.316 "rw_ios_per_sec": 0, 00:10:53.316 "rw_mbytes_per_sec": 0, 00:10:53.316 "r_mbytes_per_sec": 0, 00:10:53.316 "w_mbytes_per_sec": 0 00:10:53.316 }, 00:10:53.316 "claimed": true, 00:10:53.316 "claim_type": "exclusive_write", 00:10:53.316 "zoned": false, 00:10:53.316 "supported_io_types": { 00:10:53.316 "read": true, 00:10:53.316 "write": true, 00:10:53.316 "unmap": true, 00:10:53.316 "flush": true, 00:10:53.316 "reset": true, 00:10:53.316 "nvme_admin": false, 00:10:53.316 "nvme_io": false, 00:10:53.316 "nvme_io_md": false, 00:10:53.316 "write_zeroes": true, 00:10:53.316 "zcopy": true, 00:10:53.316 "get_zone_info": false, 00:10:53.316 "zone_management": false, 00:10:53.316 "zone_append": false, 00:10:53.316 "compare": false, 00:10:53.316 "compare_and_write": false, 00:10:53.316 "abort": true, 00:10:53.316 "seek_hole": false, 00:10:53.316 "seek_data": false, 00:10:53.316 "copy": true, 00:10:53.316 "nvme_iov_md": false 00:10:53.316 }, 00:10:53.316 "memory_domains": [ 00:10:53.316 { 00:10:53.316 "dma_device_id": "system", 00:10:53.316 "dma_device_type": 1 00:10:53.316 }, 00:10:53.316 { 00:10:53.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.316 "dma_device_type": 2 00:10:53.316 } 00:10:53.316 ], 00:10:53.316 "driver_specific": {} 00:10:53.316 } 00:10:53.316 ] 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:53.316 "name": "Existed_Raid", 00:10:53.316 "uuid": "ef090588-9231-42ed-a0df-b986bf8111e2", 00:10:53.316 "strip_size_kb": 0, 00:10:53.316 "state": "configuring", 00:10:53.316 "raid_level": "raid1", 00:10:53.316 "superblock": true, 00:10:53.316 "num_base_bdevs": 2, 00:10:53.316 "num_base_bdevs_discovered": 1, 00:10:53.316 "num_base_bdevs_operational": 2, 00:10:53.316 "base_bdevs_list": [ 00:10:53.316 { 00:10:53.316 "name": "BaseBdev1", 00:10:53.316 "uuid": "fc04b10e-cd13-406a-b926-9369d368a415", 00:10:53.316 "is_configured": true, 00:10:53.316 "data_offset": 2048, 00:10:53.316 "data_size": 63488 00:10:53.316 }, 00:10:53.316 { 00:10:53.316 "name": "BaseBdev2", 00:10:53.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:53.316 "is_configured": false, 00:10:53.316 "data_offset": 0, 00:10:53.316 "data_size": 0 00:10:53.316 } 00:10:53.316 ] 00:10:53.316 }' 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.316 13:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:53.879 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:54.137 [2024-07-15 13:33:41.552780] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:54.137 [2024-07-15 13:33:41.552811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x199c820 name Existed_Raid, state configuring 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:54.137 [2024-07-15 13:33:41.733276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:54.137 [2024-07-15 13:33:41.734320] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:54.137 [2024-07-15 13:33:41.734351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:54.137 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:54.396 
13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.396 "name": "Existed_Raid", 00:10:54.396 "uuid": "08fe1c4a-b56c-40f0-bfe4-6eefb8b8e0de", 00:10:54.396 "strip_size_kb": 0, 00:10:54.396 "state": "configuring", 00:10:54.396 "raid_level": "raid1", 00:10:54.396 "superblock": true, 00:10:54.396 "num_base_bdevs": 2, 00:10:54.396 "num_base_bdevs_discovered": 1, 00:10:54.396 "num_base_bdevs_operational": 2, 00:10:54.396 "base_bdevs_list": [ 00:10:54.396 { 00:10:54.396 "name": "BaseBdev1", 00:10:54.396 "uuid": "fc04b10e-cd13-406a-b926-9369d368a415", 00:10:54.396 "is_configured": true, 00:10:54.396 "data_offset": 2048, 00:10:54.396 "data_size": 63488 00:10:54.396 }, 00:10:54.396 { 00:10:54.396 "name": "BaseBdev2", 00:10:54.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:54.396 "is_configured": false, 00:10:54.396 "data_offset": 0, 00:10:54.396 "data_size": 0 00:10:54.396 } 00:10:54.396 ] 00:10:54.396 }' 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.396 13:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:54.964 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:54.964 [2024-07-15 13:33:42.570198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:54.964 [2024-07-15 13:33:42.570309] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x199d610 00:10:54.964 [2024-07-15 13:33:42.570319] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:54.964 [2024-07-15 13:33:42.570434] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b51150 00:10:54.964 [2024-07-15 13:33:42.570517] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x199d610 00:10:54.964 [2024-07-15 13:33:42.570527] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x199d610 00:10:54.964 [2024-07-15 13:33:42.570588] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:54.964 BaseBdev2 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
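
The trace around this point shows the superblock variant assembling Existed_Raid from two 32 MiB malloc bdevs and the raid moving to the online state once the second leg is claimed (note data_offset 2048 and data_size 63488: the on-disk superblock consumes the first 2048 of each leg's 65536 blocks). A condensed sketch of that sequence, using only RPCs that appear verbatim in the trace; the jq checks are illustrative rather than the script's exact assertions.

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Two 32 MiB / 512 B-block malloc bdevs serve as the raid1 legs.
  $RPC bdev_malloc_create 32 512 -b BaseBdev1
  $RPC bdev_malloc_create 32 512 -b BaseBdev2

  # -s writes an on-disk superblock, shrinking each leg's usable data
  # from 65536 to 63488 blocks with data_offset 2048.
  $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

  # With both legs discovered the raid should report state "online".
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'
  $RPC bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid").num_base_bdevs_discovered'
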
00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:55.223 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:55.483 [ 00:10:55.483 { 00:10:55.483 "name": "BaseBdev2", 00:10:55.483 "aliases": [ 00:10:55.483 "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3" 00:10:55.483 ], 00:10:55.483 "product_name": "Malloc disk", 00:10:55.483 "block_size": 512, 00:10:55.483 "num_blocks": 65536, 00:10:55.483 "uuid": "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3", 00:10:55.483 "assigned_rate_limits": { 00:10:55.483 "rw_ios_per_sec": 0, 00:10:55.483 "rw_mbytes_per_sec": 0, 00:10:55.483 "r_mbytes_per_sec": 0, 00:10:55.483 "w_mbytes_per_sec": 0 00:10:55.483 }, 00:10:55.483 "claimed": true, 00:10:55.483 "claim_type": "exclusive_write", 00:10:55.483 "zoned": false, 00:10:55.483 "supported_io_types": { 00:10:55.483 "read": true, 00:10:55.483 "write": true, 00:10:55.483 "unmap": true, 00:10:55.483 "flush": true, 00:10:55.483 "reset": true, 00:10:55.483 "nvme_admin": false, 00:10:55.483 "nvme_io": false, 00:10:55.483 "nvme_io_md": false, 00:10:55.483 "write_zeroes": true, 00:10:55.483 "zcopy": true, 00:10:55.483 "get_zone_info": false, 00:10:55.483 "zone_management": false, 00:10:55.483 "zone_append": false, 00:10:55.483 "compare": false, 00:10:55.483 "compare_and_write": false, 00:10:55.483 "abort": true, 00:10:55.483 "seek_hole": false, 00:10:55.483 "seek_data": false, 00:10:55.483 "copy": true, 00:10:55.483 "nvme_iov_md": false 00:10:55.483 }, 00:10:55.483 "memory_domains": [ 00:10:55.483 { 00:10:55.483 "dma_device_id": "system", 00:10:55.483 "dma_device_type": 1 00:10:55.483 }, 00:10:55.483 { 00:10:55.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.483 "dma_device_type": 2 00:10:55.483 } 00:10:55.483 ], 00:10:55.483 "driver_specific": {} 00:10:55.483 } 00:10:55.483 ] 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.483 13:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:55.742 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:55.742 "name": "Existed_Raid", 00:10:55.742 "uuid": "08fe1c4a-b56c-40f0-bfe4-6eefb8b8e0de", 00:10:55.742 "strip_size_kb": 0, 00:10:55.742 "state": "online", 00:10:55.742 "raid_level": "raid1", 00:10:55.742 "superblock": true, 00:10:55.742 "num_base_bdevs": 2, 00:10:55.742 "num_base_bdevs_discovered": 2, 00:10:55.742 "num_base_bdevs_operational": 2, 00:10:55.742 "base_bdevs_list": [ 00:10:55.742 { 00:10:55.742 "name": "BaseBdev1", 00:10:55.742 "uuid": "fc04b10e-cd13-406a-b926-9369d368a415", 00:10:55.742 "is_configured": true, 00:10:55.742 "data_offset": 2048, 00:10:55.742 "data_size": 63488 00:10:55.742 }, 00:10:55.742 { 00:10:55.742 "name": "BaseBdev2", 00:10:55.742 "uuid": "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3", 00:10:55.742 "is_configured": true, 00:10:55.742 "data_offset": 2048, 00:10:55.742 "data_size": 63488 00:10:55.742 } 00:10:55.742 ] 00:10:55.742 }' 00:10:55.742 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:55.742 13:33:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:56.000 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:56.000 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:56.000 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:56.000 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:56.001 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:56.001 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:56.001 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:56.001 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:56.259 [2024-07-15 13:33:43.761453] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.259 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:56.259 "name": "Existed_Raid", 00:10:56.259 "aliases": [ 00:10:56.259 "08fe1c4a-b56c-40f0-bfe4-6eefb8b8e0de" 00:10:56.259 ], 00:10:56.259 "product_name": "Raid Volume", 00:10:56.259 "block_size": 512, 00:10:56.259 "num_blocks": 63488, 00:10:56.259 "uuid": "08fe1c4a-b56c-40f0-bfe4-6eefb8b8e0de", 00:10:56.259 "assigned_rate_limits": { 00:10:56.259 "rw_ios_per_sec": 0, 00:10:56.259 "rw_mbytes_per_sec": 0, 00:10:56.259 "r_mbytes_per_sec": 0, 00:10:56.259 "w_mbytes_per_sec": 0 00:10:56.259 }, 00:10:56.259 "claimed": false, 00:10:56.259 "zoned": false, 00:10:56.259 "supported_io_types": { 00:10:56.259 "read": true, 
00:10:56.259 "write": true, 00:10:56.259 "unmap": false, 00:10:56.259 "flush": false, 00:10:56.259 "reset": true, 00:10:56.259 "nvme_admin": false, 00:10:56.259 "nvme_io": false, 00:10:56.259 "nvme_io_md": false, 00:10:56.259 "write_zeroes": true, 00:10:56.259 "zcopy": false, 00:10:56.259 "get_zone_info": false, 00:10:56.259 "zone_management": false, 00:10:56.259 "zone_append": false, 00:10:56.259 "compare": false, 00:10:56.259 "compare_and_write": false, 00:10:56.259 "abort": false, 00:10:56.259 "seek_hole": false, 00:10:56.259 "seek_data": false, 00:10:56.259 "copy": false, 00:10:56.259 "nvme_iov_md": false 00:10:56.259 }, 00:10:56.259 "memory_domains": [ 00:10:56.259 { 00:10:56.259 "dma_device_id": "system", 00:10:56.259 "dma_device_type": 1 00:10:56.259 }, 00:10:56.259 { 00:10:56.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.259 "dma_device_type": 2 00:10:56.259 }, 00:10:56.259 { 00:10:56.259 "dma_device_id": "system", 00:10:56.259 "dma_device_type": 1 00:10:56.259 }, 00:10:56.259 { 00:10:56.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.259 "dma_device_type": 2 00:10:56.259 } 00:10:56.259 ], 00:10:56.259 "driver_specific": { 00:10:56.259 "raid": { 00:10:56.259 "uuid": "08fe1c4a-b56c-40f0-bfe4-6eefb8b8e0de", 00:10:56.259 "strip_size_kb": 0, 00:10:56.259 "state": "online", 00:10:56.259 "raid_level": "raid1", 00:10:56.259 "superblock": true, 00:10:56.259 "num_base_bdevs": 2, 00:10:56.259 "num_base_bdevs_discovered": 2, 00:10:56.259 "num_base_bdevs_operational": 2, 00:10:56.259 "base_bdevs_list": [ 00:10:56.259 { 00:10:56.259 "name": "BaseBdev1", 00:10:56.259 "uuid": "fc04b10e-cd13-406a-b926-9369d368a415", 00:10:56.259 "is_configured": true, 00:10:56.259 "data_offset": 2048, 00:10:56.259 "data_size": 63488 00:10:56.259 }, 00:10:56.259 { 00:10:56.259 "name": "BaseBdev2", 00:10:56.259 "uuid": "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3", 00:10:56.259 "is_configured": true, 00:10:56.259 "data_offset": 2048, 00:10:56.259 "data_size": 63488 00:10:56.259 } 00:10:56.259 ] 00:10:56.259 } 00:10:56.259 } 00:10:56.259 }' 00:10:56.259 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:56.259 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:56.259 BaseBdev2' 00:10:56.259 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:56.259 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:56.259 13:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:56.518 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:56.518 "name": "BaseBdev1", 00:10:56.518 "aliases": [ 00:10:56.518 "fc04b10e-cd13-406a-b926-9369d368a415" 00:10:56.518 ], 00:10:56.518 "product_name": "Malloc disk", 00:10:56.518 "block_size": 512, 00:10:56.518 "num_blocks": 65536, 00:10:56.518 "uuid": "fc04b10e-cd13-406a-b926-9369d368a415", 00:10:56.518 "assigned_rate_limits": { 00:10:56.518 "rw_ios_per_sec": 0, 00:10:56.518 "rw_mbytes_per_sec": 0, 00:10:56.518 "r_mbytes_per_sec": 0, 00:10:56.518 "w_mbytes_per_sec": 0 00:10:56.518 }, 00:10:56.518 "claimed": true, 00:10:56.518 "claim_type": "exclusive_write", 00:10:56.518 "zoned": false, 00:10:56.518 "supported_io_types": { 
00:10:56.518 "read": true, 00:10:56.518 "write": true, 00:10:56.518 "unmap": true, 00:10:56.518 "flush": true, 00:10:56.518 "reset": true, 00:10:56.518 "nvme_admin": false, 00:10:56.518 "nvme_io": false, 00:10:56.518 "nvme_io_md": false, 00:10:56.518 "write_zeroes": true, 00:10:56.518 "zcopy": true, 00:10:56.518 "get_zone_info": false, 00:10:56.518 "zone_management": false, 00:10:56.518 "zone_append": false, 00:10:56.518 "compare": false, 00:10:56.518 "compare_and_write": false, 00:10:56.518 "abort": true, 00:10:56.518 "seek_hole": false, 00:10:56.518 "seek_data": false, 00:10:56.518 "copy": true, 00:10:56.518 "nvme_iov_md": false 00:10:56.518 }, 00:10:56.518 "memory_domains": [ 00:10:56.518 { 00:10:56.518 "dma_device_id": "system", 00:10:56.518 "dma_device_type": 1 00:10:56.518 }, 00:10:56.518 { 00:10:56.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.518 "dma_device_type": 2 00:10:56.518 } 00:10:56.518 ], 00:10:56.518 "driver_specific": {} 00:10:56.518 }' 00:10:56.518 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:56.518 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:56.518 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:56.518 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:56.518 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:56.777 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:57.036 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:57.036 "name": "BaseBdev2", 00:10:57.036 "aliases": [ 00:10:57.036 "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3" 00:10:57.036 ], 00:10:57.036 "product_name": "Malloc disk", 00:10:57.036 "block_size": 512, 00:10:57.036 "num_blocks": 65536, 00:10:57.036 "uuid": "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3", 00:10:57.036 "assigned_rate_limits": { 00:10:57.036 "rw_ios_per_sec": 0, 00:10:57.036 "rw_mbytes_per_sec": 0, 00:10:57.036 "r_mbytes_per_sec": 0, 00:10:57.036 "w_mbytes_per_sec": 0 00:10:57.036 }, 00:10:57.036 "claimed": true, 00:10:57.036 "claim_type": "exclusive_write", 00:10:57.036 "zoned": false, 00:10:57.036 "supported_io_types": { 00:10:57.036 "read": true, 00:10:57.036 "write": true, 00:10:57.036 "unmap": true, 00:10:57.036 "flush": true, 00:10:57.036 "reset": 
true, 00:10:57.036 "nvme_admin": false, 00:10:57.036 "nvme_io": false, 00:10:57.036 "nvme_io_md": false, 00:10:57.036 "write_zeroes": true, 00:10:57.036 "zcopy": true, 00:10:57.036 "get_zone_info": false, 00:10:57.036 "zone_management": false, 00:10:57.036 "zone_append": false, 00:10:57.036 "compare": false, 00:10:57.036 "compare_and_write": false, 00:10:57.036 "abort": true, 00:10:57.036 "seek_hole": false, 00:10:57.036 "seek_data": false, 00:10:57.036 "copy": true, 00:10:57.036 "nvme_iov_md": false 00:10:57.036 }, 00:10:57.036 "memory_domains": [ 00:10:57.036 { 00:10:57.036 "dma_device_id": "system", 00:10:57.036 "dma_device_type": 1 00:10:57.036 }, 00:10:57.036 { 00:10:57.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.036 "dma_device_type": 2 00:10:57.036 } 00:10:57.036 ], 00:10:57.036 "driver_specific": {} 00:10:57.036 }' 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:57.037 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:57.295 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:57.295 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:57.295 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:57.295 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:57.295 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:57.295 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:57.555 [2024-07-15 13:33:44.916306] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.555 13:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:57.555 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:57.555 "name": "Existed_Raid", 00:10:57.555 "uuid": "08fe1c4a-b56c-40f0-bfe4-6eefb8b8e0de", 00:10:57.555 "strip_size_kb": 0, 00:10:57.555 "state": "online", 00:10:57.555 "raid_level": "raid1", 00:10:57.555 "superblock": true, 00:10:57.555 "num_base_bdevs": 2, 00:10:57.555 "num_base_bdevs_discovered": 1, 00:10:57.555 "num_base_bdevs_operational": 1, 00:10:57.555 "base_bdevs_list": [ 00:10:57.555 { 00:10:57.555 "name": null, 00:10:57.555 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:57.555 "is_configured": false, 00:10:57.555 "data_offset": 2048, 00:10:57.555 "data_size": 63488 00:10:57.555 }, 00:10:57.555 { 00:10:57.555 "name": "BaseBdev2", 00:10:57.555 "uuid": "e33a2aea-bc6c-4127-b58f-34fe6f46cfb3", 00:10:57.555 "is_configured": true, 00:10:57.555 "data_offset": 2048, 00:10:57.555 "data_size": 63488 00:10:57.555 } 00:10:57.555 ] 00:10:57.555 }' 00:10:57.555 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:57.555 13:33:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:58.123 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:58.123 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:58.123 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:58.123 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:58.382 [2024-07-15 13:33:45.943754] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:58.382 [2024-07-15 13:33:45.943821] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:58.382 [2024-07-15 13:33:45.953870] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:58.382 [2024-07-15 13:33:45.953913] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:58.382 [2024-07-15 13:33:45.953922] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x199d610 name Existed_Raid, state offline 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.382 13:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4177445 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4177445 ']' 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4177445 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4177445 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4177445' 00:10:58.642 killing process with pid 4177445 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4177445 00:10:58.642 [2024-07-15 13:33:46.196291] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:58.642 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4177445 00:10:58.642 [2024-07-15 13:33:46.197090] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:58.901 13:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:58.901 00:10:58.901 real 0m8.215s 00:10:58.901 user 0m14.401s 00:10:58.901 sys 0m1.681s 00:10:58.901 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:58.901 13:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:58.901 ************************************ 00:10:58.901 END TEST raid_state_function_test_sb 00:10:58.901 ************************************ 00:10:58.901 13:33:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:58.901 13:33:46 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:10:58.901 13:33:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:58.902 13:33:46 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:10:58.902 13:33:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.902 ************************************ 00:10:58.902 START TEST raid_superblock_test 00:10:58.902 ************************************ 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4178820 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4178820 /var/tmp/spdk-raid.sock 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4178820 ']' 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:58.902 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:58.902 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.161 [2024-07-15 13:33:46.531097] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
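For reference, the array setup that the raid_superblock_test xtrace below walks through can be reproduced by hand against the bdev_svc app launched above. This is a minimal sketch, not part of the test script itself; the socket path, bdev names, sizes and UUIDs are the ones used in this run:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # two 32 MB malloc bdevs (512-byte blocks), each wrapped in a passthru bdev with a fixed UUID
  $rpc bdev_malloc_create 32 512 -b malloc1
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_malloc_create 32 512 -b malloc2
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # raid1 over the two passthru bdevs; the trailing -s is what makes this the superblock variant of the test
  $rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'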
00:10:59.161 [2024-07-15 13:33:46.531148] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4178820 ] 00:10:59.161 [2024-07-15 13:33:46.618883] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:59.161 [2024-07-15 13:33:46.710342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:59.161 [2024-07-15 13:33:46.768417] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.161 [2024-07-15 13:33:46.768443] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:59.726 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:59.984 malloc1 00:10:59.984 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:00.242 [2024-07-15 13:33:47.665325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:00.242 [2024-07-15 13:33:47.665363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:00.242 [2024-07-15 13:33:47.665395] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf1260 00:11:00.242 [2024-07-15 13:33:47.665404] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:00.242 [2024-07-15 13:33:47.666686] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:00.242 [2024-07-15 13:33:47.666710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:00.242 pt1 00:11:00.242 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:00.243 malloc2 00:11:00.243 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:00.501 [2024-07-15 13:33:48.015273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:00.501 [2024-07-15 13:33:48.015309] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:00.501 [2024-07-15 13:33:48.015337] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe9b310 00:11:00.501 [2024-07-15 13:33:48.015350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:00.501 [2024-07-15 13:33:48.016544] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:00.501 [2024-07-15 13:33:48.016567] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:00.501 pt2 00:11:00.501 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:00.501 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:00.501 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:00.761 [2024-07-15 13:33:48.191750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:00.761 [2024-07-15 13:33:48.192772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:00.761 [2024-07-15 13:33:48.192876] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe9a5b0 00:11:00.761 [2024-07-15 13:33:48.192885] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:00.761 [2024-07-15 13:33:48.193030] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe9ca10 00:11:00.761 [2024-07-15 13:33:48.193134] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe9a5b0 00:11:00.761 [2024-07-15 13:33:48.193141] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe9a5b0 00:11:00.761 [2024-07-15 13:33:48.193207] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:00.761 13:33:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.761 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:01.020 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.020 "name": "raid_bdev1", 00:11:01.020 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:01.020 "strip_size_kb": 0, 00:11:01.020 "state": "online", 00:11:01.020 "raid_level": "raid1", 00:11:01.020 "superblock": true, 00:11:01.020 "num_base_bdevs": 2, 00:11:01.020 "num_base_bdevs_discovered": 2, 00:11:01.020 "num_base_bdevs_operational": 2, 00:11:01.020 "base_bdevs_list": [ 00:11:01.020 { 00:11:01.020 "name": "pt1", 00:11:01.020 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:01.020 "is_configured": true, 00:11:01.020 "data_offset": 2048, 00:11:01.020 "data_size": 63488 00:11:01.020 }, 00:11:01.020 { 00:11:01.020 "name": "pt2", 00:11:01.020 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:01.020 "is_configured": true, 00:11:01.020 "data_offset": 2048, 00:11:01.020 "data_size": 63488 00:11:01.020 } 00:11:01.020 ] 00:11:01.020 }' 00:11:01.020 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.020 13:33:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:01.279 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:01.538 [2024-07-15 13:33:49.018008] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:01.538 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:01.538 "name": "raid_bdev1", 00:11:01.538 "aliases": [ 00:11:01.538 "a21f54d9-8166-4a1f-b504-ced0fdde7ab5" 00:11:01.538 ], 00:11:01.538 "product_name": "Raid Volume", 00:11:01.538 "block_size": 512, 00:11:01.538 "num_blocks": 63488, 00:11:01.538 "uuid": 
"a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:01.538 "assigned_rate_limits": { 00:11:01.538 "rw_ios_per_sec": 0, 00:11:01.538 "rw_mbytes_per_sec": 0, 00:11:01.538 "r_mbytes_per_sec": 0, 00:11:01.538 "w_mbytes_per_sec": 0 00:11:01.538 }, 00:11:01.538 "claimed": false, 00:11:01.538 "zoned": false, 00:11:01.538 "supported_io_types": { 00:11:01.538 "read": true, 00:11:01.538 "write": true, 00:11:01.538 "unmap": false, 00:11:01.538 "flush": false, 00:11:01.538 "reset": true, 00:11:01.538 "nvme_admin": false, 00:11:01.538 "nvme_io": false, 00:11:01.538 "nvme_io_md": false, 00:11:01.538 "write_zeroes": true, 00:11:01.538 "zcopy": false, 00:11:01.538 "get_zone_info": false, 00:11:01.538 "zone_management": false, 00:11:01.538 "zone_append": false, 00:11:01.538 "compare": false, 00:11:01.538 "compare_and_write": false, 00:11:01.538 "abort": false, 00:11:01.538 "seek_hole": false, 00:11:01.538 "seek_data": false, 00:11:01.538 "copy": false, 00:11:01.538 "nvme_iov_md": false 00:11:01.538 }, 00:11:01.538 "memory_domains": [ 00:11:01.538 { 00:11:01.538 "dma_device_id": "system", 00:11:01.538 "dma_device_type": 1 00:11:01.538 }, 00:11:01.538 { 00:11:01.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.538 "dma_device_type": 2 00:11:01.538 }, 00:11:01.538 { 00:11:01.538 "dma_device_id": "system", 00:11:01.538 "dma_device_type": 1 00:11:01.538 }, 00:11:01.538 { 00:11:01.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.538 "dma_device_type": 2 00:11:01.538 } 00:11:01.538 ], 00:11:01.538 "driver_specific": { 00:11:01.538 "raid": { 00:11:01.538 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:01.538 "strip_size_kb": 0, 00:11:01.538 "state": "online", 00:11:01.538 "raid_level": "raid1", 00:11:01.538 "superblock": true, 00:11:01.538 "num_base_bdevs": 2, 00:11:01.538 "num_base_bdevs_discovered": 2, 00:11:01.538 "num_base_bdevs_operational": 2, 00:11:01.538 "base_bdevs_list": [ 00:11:01.538 { 00:11:01.538 "name": "pt1", 00:11:01.538 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:01.538 "is_configured": true, 00:11:01.538 "data_offset": 2048, 00:11:01.538 "data_size": 63488 00:11:01.538 }, 00:11:01.538 { 00:11:01.538 "name": "pt2", 00:11:01.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:01.538 "is_configured": true, 00:11:01.538 "data_offset": 2048, 00:11:01.538 "data_size": 63488 00:11:01.538 } 00:11:01.538 ] 00:11:01.538 } 00:11:01.538 } 00:11:01.538 }' 00:11:01.538 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:01.538 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:01.538 pt2' 00:11:01.538 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:01.538 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:01.538 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:01.797 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:01.797 "name": "pt1", 00:11:01.797 "aliases": [ 00:11:01.797 "00000000-0000-0000-0000-000000000001" 00:11:01.797 ], 00:11:01.797 "product_name": "passthru", 00:11:01.797 "block_size": 512, 00:11:01.797 "num_blocks": 65536, 00:11:01.797 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:01.797 "assigned_rate_limits": { 00:11:01.797 
"rw_ios_per_sec": 0, 00:11:01.797 "rw_mbytes_per_sec": 0, 00:11:01.797 "r_mbytes_per_sec": 0, 00:11:01.797 "w_mbytes_per_sec": 0 00:11:01.797 }, 00:11:01.797 "claimed": true, 00:11:01.797 "claim_type": "exclusive_write", 00:11:01.797 "zoned": false, 00:11:01.797 "supported_io_types": { 00:11:01.797 "read": true, 00:11:01.797 "write": true, 00:11:01.797 "unmap": true, 00:11:01.797 "flush": true, 00:11:01.797 "reset": true, 00:11:01.797 "nvme_admin": false, 00:11:01.797 "nvme_io": false, 00:11:01.797 "nvme_io_md": false, 00:11:01.797 "write_zeroes": true, 00:11:01.797 "zcopy": true, 00:11:01.797 "get_zone_info": false, 00:11:01.797 "zone_management": false, 00:11:01.797 "zone_append": false, 00:11:01.797 "compare": false, 00:11:01.797 "compare_and_write": false, 00:11:01.797 "abort": true, 00:11:01.797 "seek_hole": false, 00:11:01.797 "seek_data": false, 00:11:01.797 "copy": true, 00:11:01.797 "nvme_iov_md": false 00:11:01.797 }, 00:11:01.797 "memory_domains": [ 00:11:01.797 { 00:11:01.797 "dma_device_id": "system", 00:11:01.797 "dma_device_type": 1 00:11:01.797 }, 00:11:01.797 { 00:11:01.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.797 "dma_device_type": 2 00:11:01.797 } 00:11:01.797 ], 00:11:01.797 "driver_specific": { 00:11:01.797 "passthru": { 00:11:01.797 "name": "pt1", 00:11:01.797 "base_bdev_name": "malloc1" 00:11:01.797 } 00:11:01.797 } 00:11:01.797 }' 00:11:01.797 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:01.797 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:01.797 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:01.797 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:01.797 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:02.056 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:02.315 "name": "pt2", 00:11:02.315 "aliases": [ 00:11:02.315 "00000000-0000-0000-0000-000000000002" 00:11:02.315 ], 00:11:02.315 "product_name": "passthru", 00:11:02.315 "block_size": 512, 00:11:02.315 "num_blocks": 65536, 00:11:02.315 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:02.315 "assigned_rate_limits": { 00:11:02.315 "rw_ios_per_sec": 0, 00:11:02.315 "rw_mbytes_per_sec": 0, 00:11:02.315 "r_mbytes_per_sec": 0, 00:11:02.315 "w_mbytes_per_sec": 0 
00:11:02.315 }, 00:11:02.315 "claimed": true, 00:11:02.315 "claim_type": "exclusive_write", 00:11:02.315 "zoned": false, 00:11:02.315 "supported_io_types": { 00:11:02.315 "read": true, 00:11:02.315 "write": true, 00:11:02.315 "unmap": true, 00:11:02.315 "flush": true, 00:11:02.315 "reset": true, 00:11:02.315 "nvme_admin": false, 00:11:02.315 "nvme_io": false, 00:11:02.315 "nvme_io_md": false, 00:11:02.315 "write_zeroes": true, 00:11:02.315 "zcopy": true, 00:11:02.315 "get_zone_info": false, 00:11:02.315 "zone_management": false, 00:11:02.315 "zone_append": false, 00:11:02.315 "compare": false, 00:11:02.315 "compare_and_write": false, 00:11:02.315 "abort": true, 00:11:02.315 "seek_hole": false, 00:11:02.315 "seek_data": false, 00:11:02.315 "copy": true, 00:11:02.315 "nvme_iov_md": false 00:11:02.315 }, 00:11:02.315 "memory_domains": [ 00:11:02.315 { 00:11:02.315 "dma_device_id": "system", 00:11:02.315 "dma_device_type": 1 00:11:02.315 }, 00:11:02.315 { 00:11:02.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:02.315 "dma_device_type": 2 00:11:02.315 } 00:11:02.315 ], 00:11:02.315 "driver_specific": { 00:11:02.315 "passthru": { 00:11:02.315 "name": "pt2", 00:11:02.315 "base_bdev_name": "malloc2" 00:11:02.315 } 00:11:02.315 } 00:11:02.315 }' 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:02.315 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:02.573 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:02.573 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:02.573 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:02.573 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:02.573 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:02.573 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:02.573 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:02.832 [2024-07-15 13:33:50.209086] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:02.832 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a21f54d9-8166-4a1f-b504-ced0fdde7ab5 00:11:02.832 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a21f54d9-8166-4a1f-b504-ced0fdde7ab5 ']' 00:11:02.832 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:02.832 [2024-07-15 13:33:50.381363] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:02.832 [2024-07-15 13:33:50.381381] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:02.832 [2024-07-15 13:33:50.381421] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:02.832 [2024-07-15 13:33:50.381459] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:02.832 [2024-07-15 13:33:50.381467] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe9a5b0 name raid_bdev1, state offline 00:11:02.832 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.832 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:03.091 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:03.091 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:03.091 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:03.091 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:03.350 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:03.350 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:03.350 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:03.350 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:03.609 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:03.868 [2024-07-15 13:33:51.259608] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:03.868 [2024-07-15 13:33:51.260637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:03.868 [2024-07-15 13:33:51.260681] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:03.868 [2024-07-15 13:33:51.260712] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:03.868 [2024-07-15 13:33:51.260725] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:03.868 [2024-07-15 13:33:51.260732] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe9bd40 name raid_bdev1, state configuring 00:11:03.868 request: 00:11:03.868 { 00:11:03.868 "name": "raid_bdev1", 00:11:03.868 "raid_level": "raid1", 00:11:03.868 "base_bdevs": [ 00:11:03.868 "malloc1", 00:11:03.868 "malloc2" 00:11:03.868 ], 00:11:03.868 "superblock": false, 00:11:03.868 "method": "bdev_raid_create", 00:11:03.868 "req_id": 1 00:11:03.868 } 00:11:03.868 Got JSON-RPC error response 00:11:03.868 response: 00:11:03.868 { 00:11:03.868 "code": -17, 00:11:03.868 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:03.868 } 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:03.868 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:04.128 [2024-07-15 13:33:51.608497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:04.128 [2024-07-15 13:33:51.608544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:04.128 [2024-07-15 13:33:51.608560] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcefba0 00:11:04.128 [2024-07-15 13:33:51.608570] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:04.128 [2024-07-15 13:33:51.609741] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:11:04.128 [2024-07-15 13:33:51.609765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:04.128 [2024-07-15 13:33:51.609816] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:04.128 [2024-07-15 13:33:51.609834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:04.128 pt1 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.128 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:04.387 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.387 "name": "raid_bdev1", 00:11:04.387 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:04.387 "strip_size_kb": 0, 00:11:04.387 "state": "configuring", 00:11:04.387 "raid_level": "raid1", 00:11:04.387 "superblock": true, 00:11:04.387 "num_base_bdevs": 2, 00:11:04.387 "num_base_bdevs_discovered": 1, 00:11:04.387 "num_base_bdevs_operational": 2, 00:11:04.387 "base_bdevs_list": [ 00:11:04.387 { 00:11:04.387 "name": "pt1", 00:11:04.387 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:04.387 "is_configured": true, 00:11:04.387 "data_offset": 2048, 00:11:04.387 "data_size": 63488 00:11:04.387 }, 00:11:04.387 { 00:11:04.387 "name": null, 00:11:04.387 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:04.387 "is_configured": false, 00:11:04.387 "data_offset": 2048, 00:11:04.387 "data_size": 63488 00:11:04.387 } 00:11:04.387 ] 00:11:04.387 }' 00:11:04.387 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.387 13:33:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:11:04.954 [2024-07-15 13:33:52.434618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:04.954 [2024-07-15 13:33:52.434659] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:04.954 [2024-07-15 13:33:52.434673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf1e80 00:11:04.954 [2024-07-15 13:33:52.434681] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:04.954 [2024-07-15 13:33:52.434921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:04.954 [2024-07-15 13:33:52.434933] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:04.954 [2024-07-15 13:33:52.434982] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:04.954 [2024-07-15 13:33:52.435001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:04.954 [2024-07-15 13:33:52.435077] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe9d920 00:11:04.954 [2024-07-15 13:33:52.435083] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:04.954 [2024-07-15 13:33:52.435192] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea1110 00:11:04.954 [2024-07-15 13:33:52.435276] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe9d920 00:11:04.954 [2024-07-15 13:33:52.435282] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe9d920 00:11:04.954 [2024-07-15 13:33:52.435346] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.954 pt2 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:04.954 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.213 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.213 "name": "raid_bdev1", 00:11:05.213 "uuid": 
"a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:05.213 "strip_size_kb": 0, 00:11:05.213 "state": "online", 00:11:05.213 "raid_level": "raid1", 00:11:05.213 "superblock": true, 00:11:05.213 "num_base_bdevs": 2, 00:11:05.213 "num_base_bdevs_discovered": 2, 00:11:05.213 "num_base_bdevs_operational": 2, 00:11:05.213 "base_bdevs_list": [ 00:11:05.213 { 00:11:05.213 "name": "pt1", 00:11:05.213 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:05.213 "is_configured": true, 00:11:05.213 "data_offset": 2048, 00:11:05.213 "data_size": 63488 00:11:05.213 }, 00:11:05.213 { 00:11:05.213 "name": "pt2", 00:11:05.213 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:05.213 "is_configured": true, 00:11:05.213 "data_offset": 2048, 00:11:05.213 "data_size": 63488 00:11:05.213 } 00:11:05.213 ] 00:11:05.213 }' 00:11:05.213 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.213 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:05.782 [2024-07-15 13:33:53.284981] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:05.782 "name": "raid_bdev1", 00:11:05.782 "aliases": [ 00:11:05.782 "a21f54d9-8166-4a1f-b504-ced0fdde7ab5" 00:11:05.782 ], 00:11:05.782 "product_name": "Raid Volume", 00:11:05.782 "block_size": 512, 00:11:05.782 "num_blocks": 63488, 00:11:05.782 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:05.782 "assigned_rate_limits": { 00:11:05.782 "rw_ios_per_sec": 0, 00:11:05.782 "rw_mbytes_per_sec": 0, 00:11:05.782 "r_mbytes_per_sec": 0, 00:11:05.782 "w_mbytes_per_sec": 0 00:11:05.782 }, 00:11:05.782 "claimed": false, 00:11:05.782 "zoned": false, 00:11:05.782 "supported_io_types": { 00:11:05.782 "read": true, 00:11:05.782 "write": true, 00:11:05.782 "unmap": false, 00:11:05.782 "flush": false, 00:11:05.782 "reset": true, 00:11:05.782 "nvme_admin": false, 00:11:05.782 "nvme_io": false, 00:11:05.782 "nvme_io_md": false, 00:11:05.782 "write_zeroes": true, 00:11:05.782 "zcopy": false, 00:11:05.782 "get_zone_info": false, 00:11:05.782 "zone_management": false, 00:11:05.782 "zone_append": false, 00:11:05.782 "compare": false, 00:11:05.782 "compare_and_write": false, 00:11:05.782 "abort": false, 00:11:05.782 "seek_hole": false, 00:11:05.782 "seek_data": false, 00:11:05.782 "copy": false, 00:11:05.782 "nvme_iov_md": false 00:11:05.782 }, 00:11:05.782 "memory_domains": [ 00:11:05.782 { 00:11:05.782 "dma_device_id": "system", 00:11:05.782 "dma_device_type": 1 00:11:05.782 }, 
00:11:05.782 { 00:11:05.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.782 "dma_device_type": 2 00:11:05.782 }, 00:11:05.782 { 00:11:05.782 "dma_device_id": "system", 00:11:05.782 "dma_device_type": 1 00:11:05.782 }, 00:11:05.782 { 00:11:05.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.782 "dma_device_type": 2 00:11:05.782 } 00:11:05.782 ], 00:11:05.782 "driver_specific": { 00:11:05.782 "raid": { 00:11:05.782 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:05.782 "strip_size_kb": 0, 00:11:05.782 "state": "online", 00:11:05.782 "raid_level": "raid1", 00:11:05.782 "superblock": true, 00:11:05.782 "num_base_bdevs": 2, 00:11:05.782 "num_base_bdevs_discovered": 2, 00:11:05.782 "num_base_bdevs_operational": 2, 00:11:05.782 "base_bdevs_list": [ 00:11:05.782 { 00:11:05.782 "name": "pt1", 00:11:05.782 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:05.782 "is_configured": true, 00:11:05.782 "data_offset": 2048, 00:11:05.782 "data_size": 63488 00:11:05.782 }, 00:11:05.782 { 00:11:05.782 "name": "pt2", 00:11:05.782 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:05.782 "is_configured": true, 00:11:05.782 "data_offset": 2048, 00:11:05.782 "data_size": 63488 00:11:05.782 } 00:11:05.782 ] 00:11:05.782 } 00:11:05.782 } 00:11:05.782 }' 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:05.782 pt2' 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:05.782 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.043 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.043 "name": "pt1", 00:11:06.043 "aliases": [ 00:11:06.043 "00000000-0000-0000-0000-000000000001" 00:11:06.043 ], 00:11:06.043 "product_name": "passthru", 00:11:06.043 "block_size": 512, 00:11:06.043 "num_blocks": 65536, 00:11:06.043 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:06.043 "assigned_rate_limits": { 00:11:06.043 "rw_ios_per_sec": 0, 00:11:06.043 "rw_mbytes_per_sec": 0, 00:11:06.043 "r_mbytes_per_sec": 0, 00:11:06.043 "w_mbytes_per_sec": 0 00:11:06.043 }, 00:11:06.043 "claimed": true, 00:11:06.043 "claim_type": "exclusive_write", 00:11:06.043 "zoned": false, 00:11:06.043 "supported_io_types": { 00:11:06.043 "read": true, 00:11:06.043 "write": true, 00:11:06.043 "unmap": true, 00:11:06.043 "flush": true, 00:11:06.043 "reset": true, 00:11:06.043 "nvme_admin": false, 00:11:06.043 "nvme_io": false, 00:11:06.043 "nvme_io_md": false, 00:11:06.043 "write_zeroes": true, 00:11:06.043 "zcopy": true, 00:11:06.043 "get_zone_info": false, 00:11:06.043 "zone_management": false, 00:11:06.043 "zone_append": false, 00:11:06.043 "compare": false, 00:11:06.043 "compare_and_write": false, 00:11:06.043 "abort": true, 00:11:06.043 "seek_hole": false, 00:11:06.043 "seek_data": false, 00:11:06.043 "copy": true, 00:11:06.043 "nvme_iov_md": false 00:11:06.043 }, 00:11:06.043 "memory_domains": [ 00:11:06.043 { 00:11:06.043 "dma_device_id": "system", 00:11:06.043 "dma_device_type": 1 00:11:06.043 }, 00:11:06.043 { 00:11:06.043 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:06.043 "dma_device_type": 2 00:11:06.043 } 00:11:06.043 ], 00:11:06.043 "driver_specific": { 00:11:06.043 "passthru": { 00:11:06.043 "name": "pt1", 00:11:06.043 "base_bdev_name": "malloc1" 00:11:06.043 } 00:11:06.043 } 00:11:06.043 }' 00:11:06.043 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.043 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.043 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.043 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.043 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:06.301 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.559 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.559 "name": "pt2", 00:11:06.559 "aliases": [ 00:11:06.559 "00000000-0000-0000-0000-000000000002" 00:11:06.559 ], 00:11:06.559 "product_name": "passthru", 00:11:06.559 "block_size": 512, 00:11:06.559 "num_blocks": 65536, 00:11:06.559 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:06.559 "assigned_rate_limits": { 00:11:06.559 "rw_ios_per_sec": 0, 00:11:06.559 "rw_mbytes_per_sec": 0, 00:11:06.559 "r_mbytes_per_sec": 0, 00:11:06.559 "w_mbytes_per_sec": 0 00:11:06.559 }, 00:11:06.559 "claimed": true, 00:11:06.559 "claim_type": "exclusive_write", 00:11:06.559 "zoned": false, 00:11:06.559 "supported_io_types": { 00:11:06.559 "read": true, 00:11:06.559 "write": true, 00:11:06.559 "unmap": true, 00:11:06.559 "flush": true, 00:11:06.559 "reset": true, 00:11:06.559 "nvme_admin": false, 00:11:06.559 "nvme_io": false, 00:11:06.559 "nvme_io_md": false, 00:11:06.559 "write_zeroes": true, 00:11:06.559 "zcopy": true, 00:11:06.559 "get_zone_info": false, 00:11:06.559 "zone_management": false, 00:11:06.559 "zone_append": false, 00:11:06.559 "compare": false, 00:11:06.559 "compare_and_write": false, 00:11:06.559 "abort": true, 00:11:06.559 "seek_hole": false, 00:11:06.559 "seek_data": false, 00:11:06.559 "copy": true, 00:11:06.559 "nvme_iov_md": false 00:11:06.559 }, 00:11:06.559 "memory_domains": [ 00:11:06.560 { 00:11:06.560 "dma_device_id": "system", 00:11:06.560 "dma_device_type": 1 00:11:06.560 }, 00:11:06.560 { 00:11:06.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.560 "dma_device_type": 2 00:11:06.560 } 00:11:06.560 ], 00:11:06.560 "driver_specific": { 
00:11:06.560 "passthru": { 00:11:06.560 "name": "pt2", 00:11:06.560 "base_bdev_name": "malloc2" 00:11:06.560 } 00:11:06.560 } 00:11:06.560 }' 00:11:06.560 13:33:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.560 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.560 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.560 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.560 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.560 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.560 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:06.818 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:07.087 [2024-07-15 13:33:54.444006] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a21f54d9-8166-4a1f-b504-ced0fdde7ab5 '!=' a21f54d9-8166-4a1f-b504-ced0fdde7ab5 ']' 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:07.087 [2024-07-15 13:33:54.604276] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.087 13:33:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.087 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:07.346 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.346 "name": "raid_bdev1", 00:11:07.346 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:07.346 "strip_size_kb": 0, 00:11:07.346 "state": "online", 00:11:07.346 "raid_level": "raid1", 00:11:07.346 "superblock": true, 00:11:07.346 "num_base_bdevs": 2, 00:11:07.346 "num_base_bdevs_discovered": 1, 00:11:07.346 "num_base_bdevs_operational": 1, 00:11:07.346 "base_bdevs_list": [ 00:11:07.346 { 00:11:07.346 "name": null, 00:11:07.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.346 "is_configured": false, 00:11:07.346 "data_offset": 2048, 00:11:07.346 "data_size": 63488 00:11:07.346 }, 00:11:07.346 { 00:11:07.346 "name": "pt2", 00:11:07.346 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:07.346 "is_configured": true, 00:11:07.346 "data_offset": 2048, 00:11:07.346 "data_size": 63488 00:11:07.346 } 00:11:07.346 ] 00:11:07.346 }' 00:11:07.346 13:33:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.346 13:33:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.915 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:07.915 [2024-07-15 13:33:55.386262] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:07.915 [2024-07-15 13:33:55.386284] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:07.915 [2024-07-15 13:33:55.386322] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.915 [2024-07-15 13:33:55.386350] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:07.915 [2024-07-15 13:33:55.386357] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe9d920 name raid_bdev1, state offline 00:11:07.915 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:11:07.915 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:08.174 13:33:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:11:08.174 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:08.436 [2024-07-15 13:33:55.927636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:08.436 [2024-07-15 13:33:55.927672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:08.436 [2024-07-15 13:33:55.927702] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcf1490 00:11:08.436 [2024-07-15 13:33:55.927710] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:08.436 [2024-07-15 13:33:55.928874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:08.436 [2024-07-15 13:33:55.928896] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:08.436 [2024-07-15 13:33:55.928944] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:08.436 [2024-07-15 13:33:55.928964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:08.436 [2024-07-15 13:33:55.929034] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xea2360 00:11:08.436 [2024-07-15 13:33:55.929042] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:08.436 [2024-07-15 13:33:55.929155] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcf2860 00:11:08.436 [2024-07-15 13:33:55.929240] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xea2360 00:11:08.436 [2024-07-15 13:33:55.929247] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xea2360 00:11:08.436 [2024-07-15 13:33:55.929312] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:08.436 pt2 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
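At this point raid_bdev1 has just been re-assembled from the on-disk superblock found on pt2 alone, so the verify step that follows expects it to come back online with only one of its two base bdevs discovered. A hand-run equivalent of that check, sketched with the same RPC and a jq filter over the fields shown in the bdev_raid_get_bdevs dumps above (the one-liner itself is not part of the test script):

  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
  # expected for this run: online 1/2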
00:11:08.436 13:33:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:08.695 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:08.695 "name": "raid_bdev1", 00:11:08.695 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:08.695 "strip_size_kb": 0, 00:11:08.695 "state": "online", 00:11:08.695 "raid_level": "raid1", 00:11:08.695 "superblock": true, 00:11:08.695 "num_base_bdevs": 2, 00:11:08.695 "num_base_bdevs_discovered": 1, 00:11:08.695 "num_base_bdevs_operational": 1, 00:11:08.695 "base_bdevs_list": [ 00:11:08.695 { 00:11:08.695 "name": null, 00:11:08.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:08.695 "is_configured": false, 00:11:08.695 "data_offset": 2048, 00:11:08.695 "data_size": 63488 00:11:08.695 }, 00:11:08.695 { 00:11:08.695 "name": "pt2", 00:11:08.695 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:08.695 "is_configured": true, 00:11:08.695 "data_offset": 2048, 00:11:08.695 "data_size": 63488 00:11:08.695 } 00:11:08.695 ] 00:11:08.695 }' 00:11:08.695 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:08.695 13:33:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.261 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:09.261 [2024-07-15 13:33:56.793865] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:09.261 [2024-07-15 13:33:56.793887] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:09.261 [2024-07-15 13:33:56.793933] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.261 [2024-07-15 13:33:56.793967] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:09.261 [2024-07-15 13:33:56.793975] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xea2360 name raid_bdev1, state offline 00:11:09.261 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:11:09.261 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.520 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:11:09.520 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:11:09.520 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:11:09.520 13:33:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:09.779 [2024-07-15 13:33:57.150798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:09.779 [2024-07-15 13:33:57.150829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:09.779 [2024-07-15 13:33:57.150842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe9b540 00:11:09.779 [2024-07-15 13:33:57.150850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:09.779 [2024-07-15 13:33:57.152046] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:11:09.779 [2024-07-15 13:33:57.152068] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:09.779 [2024-07-15 13:33:57.152115] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:09.779 [2024-07-15 13:33:57.152134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:09.779 [2024-07-15 13:33:57.152203] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:11:09.779 [2024-07-15 13:33:57.152212] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:09.779 [2024-07-15 13:33:57.152221] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xea0230 name raid_bdev1, state configuring 00:11:09.779 [2024-07-15 13:33:57.152237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:09.779 [2024-07-15 13:33:57.152277] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xea0d40 00:11:09.779 [2024-07-15 13:33:57.152284] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:09.779 [2024-07-15 13:33:57.152399] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe9c020 00:11:09.779 [2024-07-15 13:33:57.152484] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xea0d40 00:11:09.779 [2024-07-15 13:33:57.152491] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xea0d40 00:11:09.779 [2024-07-15 13:33:57.152557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:09.779 pt1 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.779 "name": "raid_bdev1", 00:11:09.779 "uuid": "a21f54d9-8166-4a1f-b504-ced0fdde7ab5", 00:11:09.779 "strip_size_kb": 0, 00:11:09.779 "state": "online", 00:11:09.779 "raid_level": "raid1", 
00:11:09.779 "superblock": true, 00:11:09.779 "num_base_bdevs": 2, 00:11:09.779 "num_base_bdevs_discovered": 1, 00:11:09.779 "num_base_bdevs_operational": 1, 00:11:09.779 "base_bdevs_list": [ 00:11:09.779 { 00:11:09.779 "name": null, 00:11:09.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.779 "is_configured": false, 00:11:09.779 "data_offset": 2048, 00:11:09.779 "data_size": 63488 00:11:09.779 }, 00:11:09.779 { 00:11:09.779 "name": "pt2", 00:11:09.779 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:09.779 "is_configured": true, 00:11:09.779 "data_offset": 2048, 00:11:09.779 "data_size": 63488 00:11:09.779 } 00:11:09.779 ] 00:11:09.779 }' 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.779 13:33:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.346 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:10.347 13:33:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:11:10.606 [2024-07-15 13:33:58.189633] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' a21f54d9-8166-4a1f-b504-ced0fdde7ab5 '!=' a21f54d9-8166-4a1f-b504-ced0fdde7ab5 ']' 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4178820 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4178820 ']' 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4178820 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:10.606 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4178820 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4178820' 00:11:10.864 killing process with pid 4178820 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4178820 00:11:10.864 [2024-07-15 13:33:58.254315] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:10.864 [2024-07-15 13:33:58.254357] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:10.864 [2024-07-15 13:33:58.254388] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:10.864 [2024-07-15 13:33:58.254397] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xea0d40 
name raid_bdev1, state offline 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4178820 00:11:10.864 [2024-07-15 13:33:58.272266] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:10.864 00:11:10.864 real 0m11.982s 00:11:10.864 user 0m21.558s 00:11:10.864 sys 0m2.372s 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:10.864 13:33:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.864 ************************************ 00:11:10.864 END TEST raid_superblock_test 00:11:10.864 ************************************ 00:11:11.123 13:33:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:11.123 13:33:58 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:11:11.123 13:33:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:11.123 13:33:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:11.123 13:33:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:11.123 ************************************ 00:11:11.123 START TEST raid_read_error_test 00:11:11.123 ************************************ 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:11.123 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 
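For reference: the superblock test that wrapped up just above (END TEST raid_superblock_test) drives the whole scenario through rpc.py against /var/tmp/spdk-raid.sock. A condensed, non-authoritative recap of the calls visible in its trace; it assumes a running spdk_tgt with the malloc1/malloc2 bdevs already created earlier in the test, and RPC/SOCK are shorthand for this sketch only:

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    # Re-add the second base bdev as a passthru with a fixed UUID; the raid
    # superblock found on it lets raid_bdev1 come online with 1 of 2 members.
    "$RPC" -s "$SOCK" bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    "$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    # Tear the degraded array down again.
    "$RPC" -s "$SOCK" bdev_raid_delete raid_bdev1
    # Re-add the first base bdev; superblock examine re-assembles raid_bdev1
    # (the trace notes pt2's newer superblock sequence number winning).
    "$RPC" -s "$SOCK" bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # Final assertion in the test: the raid bdev's UUID survived the rebuild.
    "$RPC" -s "$SOCK" bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid'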
00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.QJD4t87nD5 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4180693 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4180693 /var/tmp/spdk-raid.sock 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4180693 ']' 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:11.124 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:11.124 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.124 [2024-07-15 13:33:58.591875] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:11:11.124 [2024-07-15 13:33:58.591925] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4180693 ] 00:11:11.124 [2024-07-15 13:33:58.675882] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:11.382 [2024-07-15 13:33:58.767710] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:11.382 [2024-07-15 13:33:58.826979] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.382 [2024-07-15 13:33:58.827014] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.948 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:11.948 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:11.948 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:11.948 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:11.948 BaseBdev1_malloc 00:11:12.206 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:12.206 true 00:11:12.206 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:12.465 [2024-07-15 13:33:59.877657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
EE_BaseBdev1_malloc 00:11:12.465 [2024-07-15 13:33:59.877690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.465 [2024-07-15 13:33:59.877723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeef990 00:11:12.465 [2024-07-15 13:33:59.877733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.465 [2024-07-15 13:33:59.879068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.465 [2024-07-15 13:33:59.879090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:12.465 BaseBdev1 00:11:12.465 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:12.465 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:12.465 BaseBdev2_malloc 00:11:12.465 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:12.761 true 00:11:12.761 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:13.037 [2024-07-15 13:34:00.415398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:13.037 [2024-07-15 13:34:00.415435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:13.037 [2024-07-15 13:34:00.415452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef41d0 00:11:13.038 [2024-07-15 13:34:00.415460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:13.038 [2024-07-15 13:34:00.416679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:13.038 [2024-07-15 13:34:00.416702] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:13.038 BaseBdev2 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:13.038 [2024-07-15 13:34:00.587866] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:13.038 [2024-07-15 13:34:00.588924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:13.038 [2024-07-15 13:34:00.589090] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xef5be0 00:11:13.038 [2024-07-15 13:34:00.589101] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:13.038 [2024-07-15 13:34:00.589246] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd5d5c0 00:11:13.038 [2024-07-15 13:34:00.589359] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xef5be0 00:11:13.038 [2024-07-15 13:34:00.589366] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xef5be0 00:11:13.038 [2024-07-15 13:34:00.589445] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.038 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:13.311 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.311 "name": "raid_bdev1", 00:11:13.311 "uuid": "1f666658-a241-474a-a242-3a95ac9787c5", 00:11:13.311 "strip_size_kb": 0, 00:11:13.311 "state": "online", 00:11:13.311 "raid_level": "raid1", 00:11:13.311 "superblock": true, 00:11:13.311 "num_base_bdevs": 2, 00:11:13.311 "num_base_bdevs_discovered": 2, 00:11:13.311 "num_base_bdevs_operational": 2, 00:11:13.311 "base_bdevs_list": [ 00:11:13.311 { 00:11:13.311 "name": "BaseBdev1", 00:11:13.311 "uuid": "8aee6521-f992-547c-ba9a-5a06a13da2a6", 00:11:13.311 "is_configured": true, 00:11:13.311 "data_offset": 2048, 00:11:13.311 "data_size": 63488 00:11:13.311 }, 00:11:13.311 { 00:11:13.311 "name": "BaseBdev2", 00:11:13.311 "uuid": "5231d6fa-fd3c-5fe8-8f04-a355a410b8c9", 00:11:13.311 "is_configured": true, 00:11:13.311 "data_offset": 2048, 00:11:13.311 "data_size": 63488 00:11:13.311 } 00:11:13.311 ] 00:11:13.311 }' 00:11:13.311 13:34:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.311 13:34:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.874 13:34:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:13.874 13:34:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:13.874 [2024-07-15 13:34:01.338045] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xef1530 00:11:14.805 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:15.062 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.063 "name": "raid_bdev1", 00:11:15.063 "uuid": "1f666658-a241-474a-a242-3a95ac9787c5", 00:11:15.063 "strip_size_kb": 0, 00:11:15.063 "state": "online", 00:11:15.063 "raid_level": "raid1", 00:11:15.063 "superblock": true, 00:11:15.063 "num_base_bdevs": 2, 00:11:15.063 "num_base_bdevs_discovered": 2, 00:11:15.063 "num_base_bdevs_operational": 2, 00:11:15.063 "base_bdevs_list": [ 00:11:15.063 { 00:11:15.063 "name": "BaseBdev1", 00:11:15.063 "uuid": "8aee6521-f992-547c-ba9a-5a06a13da2a6", 00:11:15.063 "is_configured": true, 00:11:15.063 "data_offset": 2048, 00:11:15.063 "data_size": 63488 00:11:15.063 }, 00:11:15.063 { 00:11:15.063 "name": "BaseBdev2", 00:11:15.063 "uuid": "5231d6fa-fd3c-5fe8-8f04-a355a410b8c9", 00:11:15.063 "is_configured": true, 00:11:15.063 "data_offset": 2048, 00:11:15.063 "data_size": 63488 00:11:15.063 } 00:11:15.063 ] 00:11:15.063 }' 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.063 13:34:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.641 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:15.898 [2024-07-15 13:34:03.286969] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:15.898 [2024-07-15 13:34:03.287006] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:15.898 [2024-07-15 13:34:03.289097] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:15.898 [2024-07-15 13:34:03.289118] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:15.898 [2024-07-15 13:34:03.289170] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:15.898 [2024-07-15 13:34:03.289178] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xef5be0 name raid_bdev1, state offline 00:11:15.898 0 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4180693 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4180693 ']' 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4180693 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4180693 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4180693' 00:11:15.898 killing process with pid 4180693 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4180693 00:11:15.898 [2024-07-15 13:34:03.357040] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:15.898 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4180693 00:11:15.898 [2024-07-15 13:34:03.367456] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.QJD4t87nD5 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:16.160 00:11:16.160 real 0m5.046s 00:11:16.160 user 0m7.598s 00:11:16.160 sys 0m0.890s 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:16.160 13:34:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:16.160 ************************************ 00:11:16.160 END TEST raid_read_error_test 00:11:16.160 ************************************ 00:11:16.160 13:34:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:16.160 13:34:03 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:11:16.160 13:34:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:16.160 13:34:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:16.160 13:34:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:16.160 ************************************ 00:11:16.160 START TEST raid_write_error_test 00:11:16.160 ************************************ 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test 
raid1 2 write 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:16.160 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.1HTsaXRLjr 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4181462 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4181462 /var/tmp/spdk-raid.sock 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4181462 ']' 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:16.161 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
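For reference: the read-error test that just finished and the write-error test starting here build the same bdev stack before injecting failures: a malloc bdev wrapped by an error bdev, wrapped by a passthru, used as a RAID1 member. A sketch of the setup for one member as it appears in the trace (BaseBdev2 is set up identically; assumes a running target on /var/tmp/spdk-raid.sock, with RPC/SOCK as shorthand for this sketch):

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    # 32 MiB backing malloc bdev with 512-byte blocks (65536 blocks total).
    "$RPC" -s "$SOCK" bdev_malloc_create 32 512 -b BaseBdev1_malloc
    # Error-injection wrapper; it shows up as EE_BaseBdev1_malloc.
    "$RPC" -s "$SOCK" bdev_error_create BaseBdev1_malloc
    # Passthru on top gives the raid module a plain base bdev to claim.
    "$RPC" -s "$SOCK" bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # Two such members form the RAID1 under test; -s writes a superblock.
    "$RPC" -s "$SOCK" bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
    # The read test then injects read failures on the wrapped error bdev
    # (the write test uses 'write failure' instead):
    "$RPC" -s "$SOCK" bdev_error_inject_error EE_BaseBdev1_malloc read failure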
00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:16.161 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:16.161 [2024-07-15 13:34:03.738111] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:11:16.161 [2024-07-15 13:34:03.738169] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4181462 ] 00:11:16.425 [2024-07-15 13:34:03.826073] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:16.425 [2024-07-15 13:34:03.920707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:16.425 [2024-07-15 13:34:03.982280] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:16.425 [2024-07-15 13:34:03.982306] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:16.988 13:34:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:16.988 13:34:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:16.988 13:34:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:16.988 13:34:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:17.245 BaseBdev1_malloc 00:11:17.245 13:34:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:17.503 true 00:11:17.503 13:34:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:17.503 [2024-07-15 13:34:05.051612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:17.503 [2024-07-15 13:34:05.051649] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:17.503 [2024-07-15 13:34:05.051662] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x255d990 00:11:17.503 [2024-07-15 13:34:05.051671] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:17.503 [2024-07-15 13:34:05.052858] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:17.503 [2024-07-15 13:34:05.052880] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:17.503 BaseBdev1 00:11:17.503 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:17.503 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:17.762 BaseBdev2_malloc 00:11:17.762 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:18.020 true 00:11:18.020 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:18.020 [2024-07-15 13:34:05.596983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:18.020 [2024-07-15 13:34:05.597042] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:18.020 [2024-07-15 13:34:05.597059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25621d0 00:11:18.020 [2024-07-15 13:34:05.597068] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:18.020 [2024-07-15 13:34:05.598187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:18.020 [2024-07-15 13:34:05.598212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:18.020 BaseBdev2 00:11:18.020 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:18.279 [2024-07-15 13:34:05.781479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:18.279 [2024-07-15 13:34:05.782343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:18.279 [2024-07-15 13:34:05.782481] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2563be0 00:11:18.279 [2024-07-15 13:34:05.782491] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:18.279 [2024-07-15 13:34:05.782625] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23cb5c0 00:11:18.279 [2024-07-15 13:34:05.782732] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2563be0 00:11:18.279 [2024-07-15 13:34:05.782739] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2563be0 00:11:18.279 [2024-07-15 13:34:05.782812] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.279 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:11:18.538 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.538 "name": "raid_bdev1", 00:11:18.538 "uuid": "4eb70b94-1ed2-4bc7-afef-24e4775d52ea", 00:11:18.538 "strip_size_kb": 0, 00:11:18.538 "state": "online", 00:11:18.538 "raid_level": "raid1", 00:11:18.538 "superblock": true, 00:11:18.538 "num_base_bdevs": 2, 00:11:18.538 "num_base_bdevs_discovered": 2, 00:11:18.538 "num_base_bdevs_operational": 2, 00:11:18.538 "base_bdevs_list": [ 00:11:18.538 { 00:11:18.538 "name": "BaseBdev1", 00:11:18.538 "uuid": "a0242566-195a-53a4-a666-748239bd852f", 00:11:18.538 "is_configured": true, 00:11:18.538 "data_offset": 2048, 00:11:18.538 "data_size": 63488 00:11:18.538 }, 00:11:18.538 { 00:11:18.538 "name": "BaseBdev2", 00:11:18.538 "uuid": "e2919b06-6faa-516d-a53d-9490f7310943", 00:11:18.538 "is_configured": true, 00:11:18.538 "data_offset": 2048, 00:11:18.538 "data_size": 63488 00:11:18.538 } 00:11:18.538 ] 00:11:18.538 }' 00:11:18.538 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.538 13:34:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.104 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:19.104 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:19.104 [2024-07-15 13:34:06.531650] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x255f530 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:20.039 [2024-07-15 13:34:07.620766] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:11:20.039 [2024-07-15 13:34:07.620820] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:20.039 [2024-07-15 13:34:07.620993] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x255f530 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.039 13:34:07 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.039 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:20.298 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.298 "name": "raid_bdev1", 00:11:20.298 "uuid": "4eb70b94-1ed2-4bc7-afef-24e4775d52ea", 00:11:20.298 "strip_size_kb": 0, 00:11:20.298 "state": "online", 00:11:20.298 "raid_level": "raid1", 00:11:20.298 "superblock": true, 00:11:20.298 "num_base_bdevs": 2, 00:11:20.298 "num_base_bdevs_discovered": 1, 00:11:20.298 "num_base_bdevs_operational": 1, 00:11:20.298 "base_bdevs_list": [ 00:11:20.298 { 00:11:20.298 "name": null, 00:11:20.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.298 "is_configured": false, 00:11:20.298 "data_offset": 2048, 00:11:20.298 "data_size": 63488 00:11:20.298 }, 00:11:20.298 { 00:11:20.298 "name": "BaseBdev2", 00:11:20.298 "uuid": "e2919b06-6faa-516d-a53d-9490f7310943", 00:11:20.298 "is_configured": true, 00:11:20.298 "data_offset": 2048, 00:11:20.298 "data_size": 63488 00:11:20.298 } 00:11:20.298 ] 00:11:20.298 }' 00:11:20.298 13:34:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.298 13:34:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:20.864 [2024-07-15 13:34:08.449621] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:20.864 [2024-07-15 13:34:08.449653] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:20.864 [2024-07-15 13:34:08.451789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:20.864 [2024-07-15 13:34:08.451810] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:20.864 [2024-07-15 13:34:08.451850] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:20.864 [2024-07-15 13:34:08.451858] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2563be0 name raid_bdev1, state offline 00:11:20.864 0 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4181462 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4181462 ']' 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4181462 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:20.864 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4181462 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4181462' 00:11:21.123 killing process with pid 4181462 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4181462 00:11:21.123 [2024-07-15 13:34:08.508629] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4181462 00:11:21.123 [2024-07-15 13:34:08.519507] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.1HTsaXRLjr 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:21.123 00:11:21.123 real 0m5.070s 00:11:21.123 user 0m7.610s 00:11:21.123 sys 0m0.895s 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:21.123 13:34:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.123 ************************************ 00:11:21.123 END TEST raid_write_error_test 00:11:21.123 ************************************ 00:11:21.382 13:34:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:21.382 13:34:08 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:21.382 13:34:08 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:21.382 13:34:08 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:11:21.382 13:34:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:21.382 13:34:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:21.382 13:34:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:21.382 ************************************ 00:11:21.382 START TEST raid_state_function_test 00:11:21.382 ************************************ 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4182262 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4182262' 00:11:21.382 Process raid pid: 4182262 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4182262 /var/tmp/spdk-raid.sock 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4182262 ']' 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:21.382 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
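For reference: once that stack exists, both error tests above hand I/O generation to bdevperf rather than scripting it by hand. The invocation is visible earlier in this trace and is reproduced here only as a sketch; the bdevperf log file name comes from mktemp -p /raidtest, so it differs per run:

    # Start bdevperf against the raid target; -z makes it wait for an RPC
    # 'perform_tests' call before generating I/O. The harness backgrounds this
    # process and waits for the socket (waitforlisten in the trace).
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 128k -q 1 -z -f -L bdev_raid &
    # Kick off the run over the same RPC socket.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests
    # Afterwards the test pulls the per-second failure count for raid_bdev1 out
    # of the bdevperf log (field 6) and expects 0.00 for raid1:
    grep -v Job /raidtest/tmp.QJD4t87nD5 | grep raid_bdev1 | awk '{print $6}'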
00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:21.382 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.382 [2024-07-15 13:34:08.886680] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:11:21.382 [2024-07-15 13:34:08.886741] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:21.382 [2024-07-15 13:34:08.976160] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:21.640 [2024-07-15 13:34:09.066365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:21.640 [2024-07-15 13:34:09.120839] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.640 [2024-07-15 13:34:09.120866] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:22.206 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:22.206 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:22.206 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:22.463 [2024-07-15 13:34:09.848034] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:22.463 [2024-07-15 13:34:09.848070] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:22.463 [2024-07-15 13:34:09.848077] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:22.463 [2024-07-15 13:34:09.848101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:22.463 [2024-07-15 13:34:09.848107] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:22.463 [2024-07-15 13:34:09.848114] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:22.463 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:22.463 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.463 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:22.463 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.463 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.463 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:22.464 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.464 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.464 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.464 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.464 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.464 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.464 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.464 "name": "Existed_Raid", 00:11:22.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.464 "strip_size_kb": 64, 00:11:22.464 "state": "configuring", 00:11:22.464 "raid_level": "raid0", 00:11:22.464 "superblock": false, 00:11:22.464 "num_base_bdevs": 3, 00:11:22.464 "num_base_bdevs_discovered": 0, 00:11:22.464 "num_base_bdevs_operational": 3, 00:11:22.464 "base_bdevs_list": [ 00:11:22.464 { 00:11:22.464 "name": "BaseBdev1", 00:11:22.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.464 "is_configured": false, 00:11:22.464 "data_offset": 0, 00:11:22.464 "data_size": 0 00:11:22.464 }, 00:11:22.464 { 00:11:22.464 "name": "BaseBdev2", 00:11:22.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.464 "is_configured": false, 00:11:22.464 "data_offset": 0, 00:11:22.464 "data_size": 0 00:11:22.464 }, 00:11:22.464 { 00:11:22.464 "name": "BaseBdev3", 00:11:22.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.464 "is_configured": false, 00:11:22.464 "data_offset": 0, 00:11:22.464 "data_size": 0 00:11:22.464 } 00:11:22.464 ] 00:11:22.464 }' 00:11:22.464 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.464 13:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.029 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:23.287 [2024-07-15 13:34:10.710157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:23.287 [2024-07-15 13:34:10.710183] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2756f50 name Existed_Raid, state configuring 00:11:23.287 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:23.287 [2024-07-15 13:34:10.890629] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:23.287 [2024-07-15 13:34:10.890654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:23.287 [2024-07-15 13:34:10.890660] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:23.287 [2024-07-15 13:34:10.890667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:23.287 [2024-07-15 13:34:10.890689] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:23.287 [2024-07-15 13:34:10.890697] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:23.545 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:23.545 [2024-07-15 13:34:11.067840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:23.545 BaseBdev1 00:11:23.545 
13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:23.545 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:23.545 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:23.545 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:23.545 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:23.545 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:23.545 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:23.804 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:23.804 [ 00:11:23.804 { 00:11:23.804 "name": "BaseBdev1", 00:11:23.804 "aliases": [ 00:11:23.804 "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73" 00:11:23.804 ], 00:11:23.804 "product_name": "Malloc disk", 00:11:23.804 "block_size": 512, 00:11:23.804 "num_blocks": 65536, 00:11:23.804 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:23.804 "assigned_rate_limits": { 00:11:23.804 "rw_ios_per_sec": 0, 00:11:23.804 "rw_mbytes_per_sec": 0, 00:11:23.804 "r_mbytes_per_sec": 0, 00:11:23.804 "w_mbytes_per_sec": 0 00:11:23.804 }, 00:11:23.804 "claimed": true, 00:11:23.804 "claim_type": "exclusive_write", 00:11:23.804 "zoned": false, 00:11:23.804 "supported_io_types": { 00:11:23.804 "read": true, 00:11:23.804 "write": true, 00:11:23.804 "unmap": true, 00:11:23.804 "flush": true, 00:11:23.804 "reset": true, 00:11:23.804 "nvme_admin": false, 00:11:23.804 "nvme_io": false, 00:11:23.804 "nvme_io_md": false, 00:11:23.804 "write_zeroes": true, 00:11:23.804 "zcopy": true, 00:11:23.804 "get_zone_info": false, 00:11:23.804 "zone_management": false, 00:11:23.804 "zone_append": false, 00:11:23.804 "compare": false, 00:11:23.804 "compare_and_write": false, 00:11:23.804 "abort": true, 00:11:23.804 "seek_hole": false, 00:11:23.804 "seek_data": false, 00:11:23.804 "copy": true, 00:11:23.804 "nvme_iov_md": false 00:11:23.804 }, 00:11:23.804 "memory_domains": [ 00:11:23.804 { 00:11:23.804 "dma_device_id": "system", 00:11:23.804 "dma_device_type": 1 00:11:23.804 }, 00:11:23.804 { 00:11:23.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.804 "dma_device_type": 2 00:11:23.804 } 00:11:23.804 ], 00:11:23.804 "driver_specific": {} 00:11:23.804 } 00:11:23.804 ] 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:24.063 "name": "Existed_Raid", 00:11:24.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:24.063 "strip_size_kb": 64, 00:11:24.063 "state": "configuring", 00:11:24.063 "raid_level": "raid0", 00:11:24.063 "superblock": false, 00:11:24.063 "num_base_bdevs": 3, 00:11:24.063 "num_base_bdevs_discovered": 1, 00:11:24.063 "num_base_bdevs_operational": 3, 00:11:24.063 "base_bdevs_list": [ 00:11:24.063 { 00:11:24.063 "name": "BaseBdev1", 00:11:24.063 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:24.063 "is_configured": true, 00:11:24.063 "data_offset": 0, 00:11:24.063 "data_size": 65536 00:11:24.063 }, 00:11:24.063 { 00:11:24.063 "name": "BaseBdev2", 00:11:24.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:24.063 "is_configured": false, 00:11:24.063 "data_offset": 0, 00:11:24.063 "data_size": 0 00:11:24.063 }, 00:11:24.063 { 00:11:24.063 "name": "BaseBdev3", 00:11:24.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:24.063 "is_configured": false, 00:11:24.063 "data_offset": 0, 00:11:24.063 "data_size": 0 00:11:24.063 } 00:11:24.063 ] 00:11:24.063 }' 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:24.063 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.631 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:24.631 [2024-07-15 13:34:12.222982] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:24.631 [2024-07-15 13:34:12.223033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2756820 name Existed_Raid, state configuring 00:11:24.631 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:24.890 [2024-07-15 13:34:12.403473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:24.890 [2024-07-15 13:34:12.404532] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:24.890 [2024-07-15 13:34:12.404562] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:24.890 [2024-07-15 13:34:12.404569] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:24.890 [2024-07-15 13:34:12.404577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev3 doesn't exist now 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.890 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.150 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.150 "name": "Existed_Raid", 00:11:25.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.150 "strip_size_kb": 64, 00:11:25.150 "state": "configuring", 00:11:25.150 "raid_level": "raid0", 00:11:25.150 "superblock": false, 00:11:25.150 "num_base_bdevs": 3, 00:11:25.150 "num_base_bdevs_discovered": 1, 00:11:25.150 "num_base_bdevs_operational": 3, 00:11:25.150 "base_bdevs_list": [ 00:11:25.150 { 00:11:25.150 "name": "BaseBdev1", 00:11:25.150 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:25.150 "is_configured": true, 00:11:25.150 "data_offset": 0, 00:11:25.150 "data_size": 65536 00:11:25.150 }, 00:11:25.150 { 00:11:25.150 "name": "BaseBdev2", 00:11:25.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.150 "is_configured": false, 00:11:25.150 "data_offset": 0, 00:11:25.150 "data_size": 0 00:11:25.150 }, 00:11:25.150 { 00:11:25.150 "name": "BaseBdev3", 00:11:25.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.150 "is_configured": false, 00:11:25.150 "data_offset": 0, 00:11:25.150 "data_size": 0 00:11:25.150 } 00:11:25.150 ] 00:11:25.150 }' 00:11:25.150 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.150 13:34:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:25.717 [2024-07-15 13:34:13.240468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:25.717 BaseBdev2 00:11:25.717 13:34:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:25.717 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:25.975 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:26.232 [ 00:11:26.232 { 00:11:26.232 "name": "BaseBdev2", 00:11:26.232 "aliases": [ 00:11:26.233 "9edb5b4d-2b90-4c4f-8247-37706987dacf" 00:11:26.233 ], 00:11:26.233 "product_name": "Malloc disk", 00:11:26.233 "block_size": 512, 00:11:26.233 "num_blocks": 65536, 00:11:26.233 "uuid": "9edb5b4d-2b90-4c4f-8247-37706987dacf", 00:11:26.233 "assigned_rate_limits": { 00:11:26.233 "rw_ios_per_sec": 0, 00:11:26.233 "rw_mbytes_per_sec": 0, 00:11:26.233 "r_mbytes_per_sec": 0, 00:11:26.233 "w_mbytes_per_sec": 0 00:11:26.233 }, 00:11:26.233 "claimed": true, 00:11:26.233 "claim_type": "exclusive_write", 00:11:26.233 "zoned": false, 00:11:26.233 "supported_io_types": { 00:11:26.233 "read": true, 00:11:26.233 "write": true, 00:11:26.233 "unmap": true, 00:11:26.233 "flush": true, 00:11:26.233 "reset": true, 00:11:26.233 "nvme_admin": false, 00:11:26.233 "nvme_io": false, 00:11:26.233 "nvme_io_md": false, 00:11:26.233 "write_zeroes": true, 00:11:26.233 "zcopy": true, 00:11:26.233 "get_zone_info": false, 00:11:26.233 "zone_management": false, 00:11:26.233 "zone_append": false, 00:11:26.233 "compare": false, 00:11:26.233 "compare_and_write": false, 00:11:26.233 "abort": true, 00:11:26.233 "seek_hole": false, 00:11:26.233 "seek_data": false, 00:11:26.233 "copy": true, 00:11:26.233 "nvme_iov_md": false 00:11:26.233 }, 00:11:26.233 "memory_domains": [ 00:11:26.233 { 00:11:26.233 "dma_device_id": "system", 00:11:26.233 "dma_device_type": 1 00:11:26.233 }, 00:11:26.233 { 00:11:26.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.233 "dma_device_type": 2 00:11:26.233 } 00:11:26.233 ], 00:11:26.233 "driver_specific": {} 00:11:26.233 } 00:11:26.233 ] 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
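The state check being traced here, and repeated throughout this test, amounts to roughly the following. Field names are taken from the JSON dumps in this log; the actual helper is verify_raid_bdev_state in test/bdev/bdev_raid.sh, so this is only an illustrative reduction:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    # with only two of the three base bdevs present, the raid0 array must still be assembling
    [ "$(echo "$info" | jq -r '.state')" = configuring ]
    [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 2 ]
    [ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq 3 ]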
00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.233 "name": "Existed_Raid", 00:11:26.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:26.233 "strip_size_kb": 64, 00:11:26.233 "state": "configuring", 00:11:26.233 "raid_level": "raid0", 00:11:26.233 "superblock": false, 00:11:26.233 "num_base_bdevs": 3, 00:11:26.233 "num_base_bdevs_discovered": 2, 00:11:26.233 "num_base_bdevs_operational": 3, 00:11:26.233 "base_bdevs_list": [ 00:11:26.233 { 00:11:26.233 "name": "BaseBdev1", 00:11:26.233 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:26.233 "is_configured": true, 00:11:26.233 "data_offset": 0, 00:11:26.233 "data_size": 65536 00:11:26.233 }, 00:11:26.233 { 00:11:26.233 "name": "BaseBdev2", 00:11:26.233 "uuid": "9edb5b4d-2b90-4c4f-8247-37706987dacf", 00:11:26.233 "is_configured": true, 00:11:26.233 "data_offset": 0, 00:11:26.233 "data_size": 65536 00:11:26.233 }, 00:11:26.233 { 00:11:26.233 "name": "BaseBdev3", 00:11:26.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:26.233 "is_configured": false, 00:11:26.233 "data_offset": 0, 00:11:26.233 "data_size": 0 00:11:26.233 } 00:11:26.233 ] 00:11:26.233 }' 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.233 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.800 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:27.058 [2024-07-15 13:34:14.450505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:27.058 [2024-07-15 13:34:14.450539] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2757710 00:11:27.058 [2024-07-15 13:34:14.450545] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:27.058 [2024-07-15 13:34:14.450680] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27573e0 00:11:27.058 [2024-07-15 13:34:14.450772] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2757710 00:11:27.058 [2024-07-15 13:34:14.450779] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2757710 00:11:27.058 [2024-07-15 13:34:14.450906] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:27.058 BaseBdev3 00:11:27.058 13:34:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:27.058 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:27.317 [ 00:11:27.317 { 00:11:27.317 "name": "BaseBdev3", 00:11:27.317 "aliases": [ 00:11:27.317 "2206f0eb-c372-4b5e-8f41-f0c6d51e668e" 00:11:27.317 ], 00:11:27.317 "product_name": "Malloc disk", 00:11:27.317 "block_size": 512, 00:11:27.317 "num_blocks": 65536, 00:11:27.317 "uuid": "2206f0eb-c372-4b5e-8f41-f0c6d51e668e", 00:11:27.317 "assigned_rate_limits": { 00:11:27.317 "rw_ios_per_sec": 0, 00:11:27.317 "rw_mbytes_per_sec": 0, 00:11:27.317 "r_mbytes_per_sec": 0, 00:11:27.317 "w_mbytes_per_sec": 0 00:11:27.317 }, 00:11:27.317 "claimed": true, 00:11:27.317 "claim_type": "exclusive_write", 00:11:27.317 "zoned": false, 00:11:27.317 "supported_io_types": { 00:11:27.317 "read": true, 00:11:27.317 "write": true, 00:11:27.317 "unmap": true, 00:11:27.317 "flush": true, 00:11:27.317 "reset": true, 00:11:27.317 "nvme_admin": false, 00:11:27.317 "nvme_io": false, 00:11:27.317 "nvme_io_md": false, 00:11:27.317 "write_zeroes": true, 00:11:27.317 "zcopy": true, 00:11:27.317 "get_zone_info": false, 00:11:27.317 "zone_management": false, 00:11:27.317 "zone_append": false, 00:11:27.317 "compare": false, 00:11:27.317 "compare_and_write": false, 00:11:27.317 "abort": true, 00:11:27.317 "seek_hole": false, 00:11:27.317 "seek_data": false, 00:11:27.317 "copy": true, 00:11:27.317 "nvme_iov_md": false 00:11:27.317 }, 00:11:27.317 "memory_domains": [ 00:11:27.317 { 00:11:27.317 "dma_device_id": "system", 00:11:27.317 "dma_device_type": 1 00:11:27.317 }, 00:11:27.317 { 00:11:27.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.317 "dma_device_type": 2 00:11:27.317 } 00:11:27.317 ], 00:11:27.317 "driver_specific": {} 00:11:27.317 } 00:11:27.317 ] 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:27.317 
13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.317 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:27.576 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.576 "name": "Existed_Raid", 00:11:27.576 "uuid": "07ce82c6-ed4a-4a5f-9901-38dfd9ed6830", 00:11:27.576 "strip_size_kb": 64, 00:11:27.576 "state": "online", 00:11:27.576 "raid_level": "raid0", 00:11:27.576 "superblock": false, 00:11:27.576 "num_base_bdevs": 3, 00:11:27.576 "num_base_bdevs_discovered": 3, 00:11:27.576 "num_base_bdevs_operational": 3, 00:11:27.576 "base_bdevs_list": [ 00:11:27.576 { 00:11:27.576 "name": "BaseBdev1", 00:11:27.576 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:27.576 "is_configured": true, 00:11:27.576 "data_offset": 0, 00:11:27.576 "data_size": 65536 00:11:27.576 }, 00:11:27.576 { 00:11:27.576 "name": "BaseBdev2", 00:11:27.576 "uuid": "9edb5b4d-2b90-4c4f-8247-37706987dacf", 00:11:27.576 "is_configured": true, 00:11:27.576 "data_offset": 0, 00:11:27.576 "data_size": 65536 00:11:27.576 }, 00:11:27.576 { 00:11:27.576 "name": "BaseBdev3", 00:11:27.576 "uuid": "2206f0eb-c372-4b5e-8f41-f0c6d51e668e", 00:11:27.576 "is_configured": true, 00:11:27.576 "data_offset": 0, 00:11:27.576 "data_size": 65536 00:11:27.576 } 00:11:27.576 ] 00:11:27.576 }' 00:11:27.576 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.576 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:27.835 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:28.094 [2024-07-15 13:34:15.601730] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:28.094 13:34:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:28.094 "name": "Existed_Raid", 00:11:28.094 "aliases": [ 00:11:28.094 "07ce82c6-ed4a-4a5f-9901-38dfd9ed6830" 00:11:28.094 ], 00:11:28.094 "product_name": "Raid Volume", 00:11:28.094 "block_size": 512, 00:11:28.094 "num_blocks": 196608, 00:11:28.094 "uuid": "07ce82c6-ed4a-4a5f-9901-38dfd9ed6830", 00:11:28.094 "assigned_rate_limits": { 00:11:28.094 "rw_ios_per_sec": 0, 00:11:28.094 "rw_mbytes_per_sec": 0, 00:11:28.094 "r_mbytes_per_sec": 0, 00:11:28.094 "w_mbytes_per_sec": 0 00:11:28.094 }, 00:11:28.094 "claimed": false, 00:11:28.094 "zoned": false, 00:11:28.094 "supported_io_types": { 00:11:28.094 "read": true, 00:11:28.094 "write": true, 00:11:28.094 "unmap": true, 00:11:28.094 "flush": true, 00:11:28.094 "reset": true, 00:11:28.094 "nvme_admin": false, 00:11:28.094 "nvme_io": false, 00:11:28.094 "nvme_io_md": false, 00:11:28.094 "write_zeroes": true, 00:11:28.094 "zcopy": false, 00:11:28.094 "get_zone_info": false, 00:11:28.094 "zone_management": false, 00:11:28.094 "zone_append": false, 00:11:28.094 "compare": false, 00:11:28.094 "compare_and_write": false, 00:11:28.094 "abort": false, 00:11:28.094 "seek_hole": false, 00:11:28.094 "seek_data": false, 00:11:28.094 "copy": false, 00:11:28.094 "nvme_iov_md": false 00:11:28.094 }, 00:11:28.094 "memory_domains": [ 00:11:28.094 { 00:11:28.094 "dma_device_id": "system", 00:11:28.094 "dma_device_type": 1 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.094 "dma_device_type": 2 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "dma_device_id": "system", 00:11:28.094 "dma_device_type": 1 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.094 "dma_device_type": 2 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "dma_device_id": "system", 00:11:28.094 "dma_device_type": 1 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.094 "dma_device_type": 2 00:11:28.094 } 00:11:28.094 ], 00:11:28.094 "driver_specific": { 00:11:28.094 "raid": { 00:11:28.094 "uuid": "07ce82c6-ed4a-4a5f-9901-38dfd9ed6830", 00:11:28.094 "strip_size_kb": 64, 00:11:28.094 "state": "online", 00:11:28.094 "raid_level": "raid0", 00:11:28.094 "superblock": false, 00:11:28.094 "num_base_bdevs": 3, 00:11:28.094 "num_base_bdevs_discovered": 3, 00:11:28.094 "num_base_bdevs_operational": 3, 00:11:28.094 "base_bdevs_list": [ 00:11:28.094 { 00:11:28.094 "name": "BaseBdev1", 00:11:28.094 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:28.094 "is_configured": true, 00:11:28.094 "data_offset": 0, 00:11:28.094 "data_size": 65536 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "name": "BaseBdev2", 00:11:28.094 "uuid": "9edb5b4d-2b90-4c4f-8247-37706987dacf", 00:11:28.094 "is_configured": true, 00:11:28.094 "data_offset": 0, 00:11:28.094 "data_size": 65536 00:11:28.094 }, 00:11:28.094 { 00:11:28.094 "name": "BaseBdev3", 00:11:28.094 "uuid": "2206f0eb-c372-4b5e-8f41-f0c6d51e668e", 00:11:28.094 "is_configured": true, 00:11:28.094 "data_offset": 0, 00:11:28.094 "data_size": 65536 00:11:28.094 } 00:11:28.094 ] 00:11:28.094 } 00:11:28.094 } 00:11:28.094 }' 00:11:28.094 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:28.094 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:28.094 BaseBdev2 00:11:28.094 BaseBdev3' 00:11:28.094 13:34:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.094 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:28.094 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.353 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.353 "name": "BaseBdev1", 00:11:28.353 "aliases": [ 00:11:28.353 "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73" 00:11:28.353 ], 00:11:28.353 "product_name": "Malloc disk", 00:11:28.353 "block_size": 512, 00:11:28.353 "num_blocks": 65536, 00:11:28.353 "uuid": "f2e5f5fd-44f7-475b-9c11-67cfe3d15c73", 00:11:28.353 "assigned_rate_limits": { 00:11:28.353 "rw_ios_per_sec": 0, 00:11:28.353 "rw_mbytes_per_sec": 0, 00:11:28.353 "r_mbytes_per_sec": 0, 00:11:28.353 "w_mbytes_per_sec": 0 00:11:28.353 }, 00:11:28.353 "claimed": true, 00:11:28.353 "claim_type": "exclusive_write", 00:11:28.353 "zoned": false, 00:11:28.353 "supported_io_types": { 00:11:28.353 "read": true, 00:11:28.353 "write": true, 00:11:28.353 "unmap": true, 00:11:28.353 "flush": true, 00:11:28.353 "reset": true, 00:11:28.353 "nvme_admin": false, 00:11:28.353 "nvme_io": false, 00:11:28.353 "nvme_io_md": false, 00:11:28.353 "write_zeroes": true, 00:11:28.353 "zcopy": true, 00:11:28.353 "get_zone_info": false, 00:11:28.353 "zone_management": false, 00:11:28.353 "zone_append": false, 00:11:28.353 "compare": false, 00:11:28.353 "compare_and_write": false, 00:11:28.353 "abort": true, 00:11:28.353 "seek_hole": false, 00:11:28.353 "seek_data": false, 00:11:28.353 "copy": true, 00:11:28.353 "nvme_iov_md": false 00:11:28.353 }, 00:11:28.353 "memory_domains": [ 00:11:28.353 { 00:11:28.353 "dma_device_id": "system", 00:11:28.353 "dma_device_type": 1 00:11:28.353 }, 00:11:28.353 { 00:11:28.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.353 "dma_device_type": 2 00:11:28.353 } 00:11:28.353 ], 00:11:28.353 "driver_specific": {} 00:11:28.353 }' 00:11:28.353 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.353 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.353 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.353 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.353 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.612 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.612 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:28.612 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.871 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.871 "name": "BaseBdev2", 00:11:28.871 "aliases": [ 00:11:28.871 "9edb5b4d-2b90-4c4f-8247-37706987dacf" 00:11:28.871 ], 00:11:28.871 "product_name": "Malloc disk", 00:11:28.871 "block_size": 512, 00:11:28.871 "num_blocks": 65536, 00:11:28.871 "uuid": "9edb5b4d-2b90-4c4f-8247-37706987dacf", 00:11:28.871 "assigned_rate_limits": { 00:11:28.871 "rw_ios_per_sec": 0, 00:11:28.871 "rw_mbytes_per_sec": 0, 00:11:28.871 "r_mbytes_per_sec": 0, 00:11:28.871 "w_mbytes_per_sec": 0 00:11:28.871 }, 00:11:28.871 "claimed": true, 00:11:28.871 "claim_type": "exclusive_write", 00:11:28.871 "zoned": false, 00:11:28.871 "supported_io_types": { 00:11:28.871 "read": true, 00:11:28.871 "write": true, 00:11:28.871 "unmap": true, 00:11:28.871 "flush": true, 00:11:28.871 "reset": true, 00:11:28.871 "nvme_admin": false, 00:11:28.871 "nvme_io": false, 00:11:28.871 "nvme_io_md": false, 00:11:28.871 "write_zeroes": true, 00:11:28.871 "zcopy": true, 00:11:28.871 "get_zone_info": false, 00:11:28.871 "zone_management": false, 00:11:28.871 "zone_append": false, 00:11:28.871 "compare": false, 00:11:28.871 "compare_and_write": false, 00:11:28.871 "abort": true, 00:11:28.871 "seek_hole": false, 00:11:28.871 "seek_data": false, 00:11:28.871 "copy": true, 00:11:28.871 "nvme_iov_md": false 00:11:28.871 }, 00:11:28.871 "memory_domains": [ 00:11:28.871 { 00:11:28.871 "dma_device_id": "system", 00:11:28.871 "dma_device_type": 1 00:11:28.871 }, 00:11:28.871 { 00:11:28.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.871 "dma_device_type": 2 00:11:28.871 } 00:11:28.871 ], 00:11:28.871 "driver_specific": {} 00:11:28.871 }' 00:11:28.871 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.871 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.871 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.871 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.871 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:29.129 13:34:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:29.388 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:29.388 "name": "BaseBdev3", 00:11:29.388 "aliases": [ 00:11:29.388 "2206f0eb-c372-4b5e-8f41-f0c6d51e668e" 00:11:29.388 ], 00:11:29.388 "product_name": "Malloc disk", 00:11:29.388 "block_size": 512, 00:11:29.388 "num_blocks": 65536, 00:11:29.388 "uuid": "2206f0eb-c372-4b5e-8f41-f0c6d51e668e", 00:11:29.388 "assigned_rate_limits": { 00:11:29.388 "rw_ios_per_sec": 0, 00:11:29.388 "rw_mbytes_per_sec": 0, 00:11:29.388 "r_mbytes_per_sec": 0, 00:11:29.388 "w_mbytes_per_sec": 0 00:11:29.388 }, 00:11:29.388 "claimed": true, 00:11:29.388 "claim_type": "exclusive_write", 00:11:29.388 "zoned": false, 00:11:29.388 "supported_io_types": { 00:11:29.388 "read": true, 00:11:29.388 "write": true, 00:11:29.388 "unmap": true, 00:11:29.388 "flush": true, 00:11:29.388 "reset": true, 00:11:29.388 "nvme_admin": false, 00:11:29.388 "nvme_io": false, 00:11:29.388 "nvme_io_md": false, 00:11:29.388 "write_zeroes": true, 00:11:29.388 "zcopy": true, 00:11:29.388 "get_zone_info": false, 00:11:29.388 "zone_management": false, 00:11:29.388 "zone_append": false, 00:11:29.388 "compare": false, 00:11:29.388 "compare_and_write": false, 00:11:29.388 "abort": true, 00:11:29.388 "seek_hole": false, 00:11:29.388 "seek_data": false, 00:11:29.388 "copy": true, 00:11:29.388 "nvme_iov_md": false 00:11:29.388 }, 00:11:29.388 "memory_domains": [ 00:11:29.388 { 00:11:29.388 "dma_device_id": "system", 00:11:29.389 "dma_device_type": 1 00:11:29.389 }, 00:11:29.389 { 00:11:29.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.389 "dma_device_type": 2 00:11:29.389 } 00:11:29.389 ], 00:11:29.389 "driver_specific": {} 00:11:29.389 }' 00:11:29.389 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:29.389 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:29.389 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:29.389 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.389 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.389 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:29.389 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.647 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.647 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:29.647 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.647 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.647 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:29.647 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:29.905 [2024-07-15 13:34:17.298103] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:29.905 [2024-07-15 13:34:17.298127] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:29.905 [2024-07-15 13:34:17.298157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
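Deleting BaseBdev1 out from under a running array exercises the degraded path; the expected-state decision traced next can be sketched as below. The case body is an assumption based on the has_redundancy check visible in the trace (mirrored levels keep the array online, raid0 cannot survive a missing member), not a verbatim copy of bdev_raid.sh:

    case "$raid_level" in            # raid_level is raid0 in this run
        raid1) expected_state=online ;;    # redundant level: array stays usable
        *)     expected_state=offline ;;   # raid0: losing one member takes it offline
    esac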
00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.905 "name": "Existed_Raid", 00:11:29.905 "uuid": "07ce82c6-ed4a-4a5f-9901-38dfd9ed6830", 00:11:29.905 "strip_size_kb": 64, 00:11:29.905 "state": "offline", 00:11:29.905 "raid_level": "raid0", 00:11:29.905 "superblock": false, 00:11:29.905 "num_base_bdevs": 3, 00:11:29.905 "num_base_bdevs_discovered": 2, 00:11:29.905 "num_base_bdevs_operational": 2, 00:11:29.905 "base_bdevs_list": [ 00:11:29.905 { 00:11:29.905 "name": null, 00:11:29.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.905 "is_configured": false, 00:11:29.905 "data_offset": 0, 00:11:29.905 "data_size": 65536 00:11:29.905 }, 00:11:29.905 { 00:11:29.905 "name": "BaseBdev2", 00:11:29.905 "uuid": "9edb5b4d-2b90-4c4f-8247-37706987dacf", 00:11:29.905 "is_configured": true, 00:11:29.905 "data_offset": 0, 00:11:29.905 "data_size": 65536 00:11:29.905 }, 00:11:29.905 { 00:11:29.905 "name": "BaseBdev3", 00:11:29.905 "uuid": "2206f0eb-c372-4b5e-8f41-f0c6d51e668e", 00:11:29.905 "is_configured": true, 00:11:29.905 "data_offset": 0, 00:11:29.905 "data_size": 65536 00:11:29.905 } 00:11:29.905 ] 00:11:29.905 }' 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.905 13:34:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.470 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:30.470 13:34:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:30.470 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.470 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:30.727 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:30.727 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:30.727 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:30.727 [2024-07-15 13:34:18.325667] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:30.984 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:31.241 [2024-07-15 13:34:18.689481] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:31.241 [2024-07-15 13:34:18.689522] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2757710 name Existed_Raid, state offline 00:11:31.241 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:31.241 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:31.241 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.241 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:31.499 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:31.499 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:31.499 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:31.499 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:31.499 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:31.499 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:31.499 BaseBdev2 00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 
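The malloc bdevs recreated here are 32 MiB with 512-byte blocks, which is where the 65536 num_blocks in the JSON dumps comes from (32 * 1024 * 1024 / 512 = 65536). A sketch of what the waitforbdev helper entered next boils down to, as an assumed simplification of common/autotest_common.sh (both RPCs and the 2000 ms timeout are visible in the trace):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_malloc_create 32 512 -b BaseBdev2           # 32 MiB / 512 B = 65536 blocks
    $RPC bdev_wait_for_examine                            # let examine callbacks finish
    $RPC bdev_get_bdevs -b BaseBdev2 -t 2000 >/dev/null   # wait up to 2000 ms for the bdev to appear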
00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:31.499 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:31.756 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:32.013 [ 00:11:32.013 { 00:11:32.013 "name": "BaseBdev2", 00:11:32.013 "aliases": [ 00:11:32.013 "b0e67b51-c713-41ac-856f-3b0ba5630097" 00:11:32.013 ], 00:11:32.013 "product_name": "Malloc disk", 00:11:32.013 "block_size": 512, 00:11:32.013 "num_blocks": 65536, 00:11:32.013 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:32.013 "assigned_rate_limits": { 00:11:32.013 "rw_ios_per_sec": 0, 00:11:32.013 "rw_mbytes_per_sec": 0, 00:11:32.013 "r_mbytes_per_sec": 0, 00:11:32.013 "w_mbytes_per_sec": 0 00:11:32.013 }, 00:11:32.013 "claimed": false, 00:11:32.013 "zoned": false, 00:11:32.013 "supported_io_types": { 00:11:32.013 "read": true, 00:11:32.014 "write": true, 00:11:32.014 "unmap": true, 00:11:32.014 "flush": true, 00:11:32.014 "reset": true, 00:11:32.014 "nvme_admin": false, 00:11:32.014 "nvme_io": false, 00:11:32.014 "nvme_io_md": false, 00:11:32.014 "write_zeroes": true, 00:11:32.014 "zcopy": true, 00:11:32.014 "get_zone_info": false, 00:11:32.014 "zone_management": false, 00:11:32.014 "zone_append": false, 00:11:32.014 "compare": false, 00:11:32.014 "compare_and_write": false, 00:11:32.014 "abort": true, 00:11:32.014 "seek_hole": false, 00:11:32.014 "seek_data": false, 00:11:32.014 "copy": true, 00:11:32.014 "nvme_iov_md": false 00:11:32.014 }, 00:11:32.014 "memory_domains": [ 00:11:32.014 { 00:11:32.014 "dma_device_id": "system", 00:11:32.014 "dma_device_type": 1 00:11:32.014 }, 00:11:32.014 { 00:11:32.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.014 "dma_device_type": 2 00:11:32.014 } 00:11:32.014 ], 00:11:32.014 "driver_specific": {} 00:11:32.014 } 00:11:32.014 ] 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:32.014 BaseBdev3 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:32.014 13:34:19 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:32.014 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:32.271 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:32.528 [ 00:11:32.528 { 00:11:32.528 "name": "BaseBdev3", 00:11:32.528 "aliases": [ 00:11:32.528 "4a07c324-fc98-4bcd-92d0-7654c35b5ddd" 00:11:32.528 ], 00:11:32.528 "product_name": "Malloc disk", 00:11:32.528 "block_size": 512, 00:11:32.528 "num_blocks": 65536, 00:11:32.528 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:32.528 "assigned_rate_limits": { 00:11:32.528 "rw_ios_per_sec": 0, 00:11:32.528 "rw_mbytes_per_sec": 0, 00:11:32.528 "r_mbytes_per_sec": 0, 00:11:32.528 "w_mbytes_per_sec": 0 00:11:32.528 }, 00:11:32.528 "claimed": false, 00:11:32.528 "zoned": false, 00:11:32.528 "supported_io_types": { 00:11:32.528 "read": true, 00:11:32.528 "write": true, 00:11:32.528 "unmap": true, 00:11:32.528 "flush": true, 00:11:32.528 "reset": true, 00:11:32.528 "nvme_admin": false, 00:11:32.528 "nvme_io": false, 00:11:32.528 "nvme_io_md": false, 00:11:32.528 "write_zeroes": true, 00:11:32.528 "zcopy": true, 00:11:32.528 "get_zone_info": false, 00:11:32.528 "zone_management": false, 00:11:32.528 "zone_append": false, 00:11:32.528 "compare": false, 00:11:32.528 "compare_and_write": false, 00:11:32.528 "abort": true, 00:11:32.528 "seek_hole": false, 00:11:32.528 "seek_data": false, 00:11:32.528 "copy": true, 00:11:32.528 "nvme_iov_md": false 00:11:32.528 }, 00:11:32.528 "memory_domains": [ 00:11:32.528 { 00:11:32.528 "dma_device_id": "system", 00:11:32.528 "dma_device_type": 1 00:11:32.528 }, 00:11:32.528 { 00:11:32.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.528 "dma_device_type": 2 00:11:32.528 } 00:11:32.528 ], 00:11:32.528 "driver_specific": {} 00:11:32.528 } 00:11:32.528 ] 00:11:32.528 13:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:32.528 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:32.528 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:32.528 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:32.528 [2024-07-15 13:34:20.092736] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:32.528 [2024-07-15 13:34:20.092775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:32.528 [2024-07-15 13:34:20.092810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:32.528 [2024-07-15 13:34:20.093798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.528 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:32.795 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.795 "name": "Existed_Raid", 00:11:32.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.795 "strip_size_kb": 64, 00:11:32.795 "state": "configuring", 00:11:32.795 "raid_level": "raid0", 00:11:32.795 "superblock": false, 00:11:32.795 "num_base_bdevs": 3, 00:11:32.795 "num_base_bdevs_discovered": 2, 00:11:32.795 "num_base_bdevs_operational": 3, 00:11:32.795 "base_bdevs_list": [ 00:11:32.795 { 00:11:32.795 "name": "BaseBdev1", 00:11:32.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.795 "is_configured": false, 00:11:32.795 "data_offset": 0, 00:11:32.795 "data_size": 0 00:11:32.795 }, 00:11:32.795 { 00:11:32.795 "name": "BaseBdev2", 00:11:32.795 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:32.795 "is_configured": true, 00:11:32.795 "data_offset": 0, 00:11:32.795 "data_size": 65536 00:11:32.795 }, 00:11:32.795 { 00:11:32.795 "name": "BaseBdev3", 00:11:32.795 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:32.795 "is_configured": true, 00:11:32.795 "data_offset": 0, 00:11:32.795 "data_size": 65536 00:11:32.795 } 00:11:32.795 ] 00:11:32.795 }' 00:11:32.795 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.795 13:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:33.401 [2024-07-15 13:34:20.958965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.401 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.658 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.658 "name": "Existed_Raid", 00:11:33.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.658 "strip_size_kb": 64, 00:11:33.658 "state": "configuring", 00:11:33.658 "raid_level": "raid0", 00:11:33.658 "superblock": false, 00:11:33.658 "num_base_bdevs": 3, 00:11:33.658 "num_base_bdevs_discovered": 1, 00:11:33.658 "num_base_bdevs_operational": 3, 00:11:33.658 "base_bdevs_list": [ 00:11:33.658 { 00:11:33.658 "name": "BaseBdev1", 00:11:33.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.658 "is_configured": false, 00:11:33.658 "data_offset": 0, 00:11:33.658 "data_size": 0 00:11:33.658 }, 00:11:33.658 { 00:11:33.658 "name": null, 00:11:33.658 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:33.658 "is_configured": false, 00:11:33.658 "data_offset": 0, 00:11:33.658 "data_size": 65536 00:11:33.658 }, 00:11:33.658 { 00:11:33.658 "name": "BaseBdev3", 00:11:33.658 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:33.658 "is_configured": true, 00:11:33.658 "data_offset": 0, 00:11:33.658 "data_size": 65536 00:11:33.658 } 00:11:33.658 ] 00:11:33.658 }' 00:11:33.658 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.658 13:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.223 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.223 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:34.223 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:34.223 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:34.481 [2024-07-15 13:34:21.969638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:34.481 BaseBdev1 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:34.481 13:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:34.738 [ 00:11:34.738 { 00:11:34.738 "name": "BaseBdev1", 00:11:34.738 "aliases": [ 00:11:34.738 "b831ac31-ca21-42be-b5c0-34a4b2b2bc33" 00:11:34.738 ], 00:11:34.738 "product_name": "Malloc disk", 00:11:34.738 "block_size": 512, 00:11:34.738 "num_blocks": 65536, 00:11:34.738 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:34.738 "assigned_rate_limits": { 00:11:34.738 "rw_ios_per_sec": 0, 00:11:34.738 "rw_mbytes_per_sec": 0, 00:11:34.738 "r_mbytes_per_sec": 0, 00:11:34.738 "w_mbytes_per_sec": 0 00:11:34.738 }, 00:11:34.738 "claimed": true, 00:11:34.738 "claim_type": "exclusive_write", 00:11:34.738 "zoned": false, 00:11:34.738 "supported_io_types": { 00:11:34.738 "read": true, 00:11:34.738 "write": true, 00:11:34.738 "unmap": true, 00:11:34.738 "flush": true, 00:11:34.738 "reset": true, 00:11:34.738 "nvme_admin": false, 00:11:34.738 "nvme_io": false, 00:11:34.738 "nvme_io_md": false, 00:11:34.738 "write_zeroes": true, 00:11:34.738 "zcopy": true, 00:11:34.738 "get_zone_info": false, 00:11:34.738 "zone_management": false, 00:11:34.738 "zone_append": false, 00:11:34.738 "compare": false, 00:11:34.738 "compare_and_write": false, 00:11:34.738 "abort": true, 00:11:34.738 "seek_hole": false, 00:11:34.738 "seek_data": false, 00:11:34.738 "copy": true, 00:11:34.738 "nvme_iov_md": false 00:11:34.738 }, 00:11:34.738 "memory_domains": [ 00:11:34.738 { 00:11:34.738 "dma_device_id": "system", 00:11:34.738 "dma_device_type": 1 00:11:34.738 }, 00:11:34.738 { 00:11:34.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:34.738 "dma_device_type": 2 00:11:34.738 } 00:11:34.738 ], 00:11:34.738 "driver_specific": {} 00:11:34.738 } 00:11:34.738 ] 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.738 13:34:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.738 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.996 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.996 "name": "Existed_Raid", 00:11:34.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.996 "strip_size_kb": 64, 00:11:34.996 "state": "configuring", 00:11:34.996 "raid_level": "raid0", 00:11:34.996 "superblock": false, 00:11:34.996 "num_base_bdevs": 3, 00:11:34.996 "num_base_bdevs_discovered": 2, 00:11:34.996 "num_base_bdevs_operational": 3, 00:11:34.996 "base_bdevs_list": [ 00:11:34.996 { 00:11:34.996 "name": "BaseBdev1", 00:11:34.996 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:34.996 "is_configured": true, 00:11:34.996 "data_offset": 0, 00:11:34.996 "data_size": 65536 00:11:34.996 }, 00:11:34.996 { 00:11:34.996 "name": null, 00:11:34.996 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:34.996 "is_configured": false, 00:11:34.996 "data_offset": 0, 00:11:34.996 "data_size": 65536 00:11:34.996 }, 00:11:34.996 { 00:11:34.996 "name": "BaseBdev3", 00:11:34.996 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:34.996 "is_configured": true, 00:11:34.996 "data_offset": 0, 00:11:34.996 "data_size": 65536 00:11:34.996 } 00:11:34.996 ] 00:11:34.996 }' 00:11:34.996 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.996 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.559 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.559 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:35.559 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:35.559 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:35.816 [2024-07-15 13:34:23.289065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:35.816 13:34:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.816 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.073 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.073 "name": "Existed_Raid", 00:11:36.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.073 "strip_size_kb": 64, 00:11:36.073 "state": "configuring", 00:11:36.073 "raid_level": "raid0", 00:11:36.073 "superblock": false, 00:11:36.073 "num_base_bdevs": 3, 00:11:36.073 "num_base_bdevs_discovered": 1, 00:11:36.073 "num_base_bdevs_operational": 3, 00:11:36.073 "base_bdevs_list": [ 00:11:36.073 { 00:11:36.073 "name": "BaseBdev1", 00:11:36.073 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:36.073 "is_configured": true, 00:11:36.073 "data_offset": 0, 00:11:36.073 "data_size": 65536 00:11:36.073 }, 00:11:36.073 { 00:11:36.073 "name": null, 00:11:36.073 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:36.073 "is_configured": false, 00:11:36.073 "data_offset": 0, 00:11:36.073 "data_size": 65536 00:11:36.073 }, 00:11:36.073 { 00:11:36.073 "name": null, 00:11:36.073 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:36.073 "is_configured": false, 00:11:36.073 "data_offset": 0, 00:11:36.073 "data_size": 65536 00:11:36.073 } 00:11:36.073 ] 00:11:36.073 }' 00:11:36.073 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.073 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.335 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.335 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:36.595 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:36.595 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:36.851 [2024-07-15 13:34:24.275646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.851 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.852 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.852 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.852 "name": "Existed_Raid", 00:11:36.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.852 "strip_size_kb": 64, 00:11:36.852 "state": "configuring", 00:11:36.852 "raid_level": "raid0", 00:11:36.852 "superblock": false, 00:11:36.852 "num_base_bdevs": 3, 00:11:36.852 "num_base_bdevs_discovered": 2, 00:11:36.852 "num_base_bdevs_operational": 3, 00:11:36.852 "base_bdevs_list": [ 00:11:36.852 { 00:11:36.852 "name": "BaseBdev1", 00:11:36.852 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:36.852 "is_configured": true, 00:11:36.852 "data_offset": 0, 00:11:36.852 "data_size": 65536 00:11:36.852 }, 00:11:36.852 { 00:11:36.852 "name": null, 00:11:36.852 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:36.852 "is_configured": false, 00:11:36.852 "data_offset": 0, 00:11:36.852 "data_size": 65536 00:11:36.852 }, 00:11:36.852 { 00:11:36.852 "name": "BaseBdev3", 00:11:36.852 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:36.852 "is_configured": true, 00:11:36.852 "data_offset": 0, 00:11:36.852 "data_size": 65536 00:11:36.852 } 00:11:36.852 ] 00:11:36.852 }' 00:11:36.852 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.852 13:34:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.415 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.415 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:37.672 [2024-07-15 13:34:25.262210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.672 13:34:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.672 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.929 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.929 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.929 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.929 "name": "Existed_Raid", 00:11:37.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.929 "strip_size_kb": 64, 00:11:37.929 "state": "configuring", 00:11:37.929 "raid_level": "raid0", 00:11:37.929 "superblock": false, 00:11:37.929 "num_base_bdevs": 3, 00:11:37.929 "num_base_bdevs_discovered": 1, 00:11:37.929 "num_base_bdevs_operational": 3, 00:11:37.929 "base_bdevs_list": [ 00:11:37.929 { 00:11:37.929 "name": null, 00:11:37.929 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:37.929 "is_configured": false, 00:11:37.929 "data_offset": 0, 00:11:37.929 "data_size": 65536 00:11:37.929 }, 00:11:37.929 { 00:11:37.929 "name": null, 00:11:37.929 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:37.929 "is_configured": false, 00:11:37.929 "data_offset": 0, 00:11:37.929 "data_size": 65536 00:11:37.929 }, 00:11:37.929 { 00:11:37.929 "name": "BaseBdev3", 00:11:37.929 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:37.929 "is_configured": true, 00:11:37.929 "data_offset": 0, 00:11:37.929 "data_size": 65536 00:11:37.929 } 00:11:37.929 ] 00:11:37.929 }' 00:11:37.929 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.929 13:34:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.493 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.493 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:38.750 [2024-07-15 13:34:26.303324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.750 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.007 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.007 "name": "Existed_Raid", 00:11:39.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.007 "strip_size_kb": 64, 00:11:39.007 "state": "configuring", 00:11:39.007 "raid_level": "raid0", 00:11:39.007 "superblock": false, 00:11:39.007 "num_base_bdevs": 3, 00:11:39.007 "num_base_bdevs_discovered": 2, 00:11:39.007 "num_base_bdevs_operational": 3, 00:11:39.007 "base_bdevs_list": [ 00:11:39.007 { 00:11:39.007 "name": null, 00:11:39.007 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:39.007 "is_configured": false, 00:11:39.007 "data_offset": 0, 00:11:39.007 "data_size": 65536 00:11:39.007 }, 00:11:39.007 { 00:11:39.007 "name": "BaseBdev2", 00:11:39.007 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:39.007 "is_configured": true, 00:11:39.007 "data_offset": 0, 00:11:39.007 "data_size": 65536 00:11:39.007 }, 00:11:39.007 { 00:11:39.007 "name": "BaseBdev3", 00:11:39.007 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:39.007 "is_configured": true, 00:11:39.007 "data_offset": 0, 00:11:39.007 "data_size": 65536 00:11:39.007 } 00:11:39.007 ] 00:11:39.007 }' 00:11:39.007 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.007 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.573 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.573 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:39.573 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:11:39.573 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.573 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:39.832 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b831ac31-ca21-42be-b5c0-34a4b2b2bc33 00:11:40.090 [2024-07-15 13:34:27.506465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 
00:11:40.090 [2024-07-15 13:34:27.506500] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2758000 00:11:40.090 [2024-07-15 13:34:27.506512] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:40.090 [2024-07-15 13:34:27.506656] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2909a10 00:11:40.090 [2024-07-15 13:34:27.506749] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2758000 00:11:40.090 [2024-07-15 13:34:27.506756] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2758000 00:11:40.090 [2024-07-15 13:34:27.506880] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:40.090 NewBaseBdev 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:40.090 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:40.349 [ 00:11:40.349 { 00:11:40.349 "name": "NewBaseBdev", 00:11:40.349 "aliases": [ 00:11:40.349 "b831ac31-ca21-42be-b5c0-34a4b2b2bc33" 00:11:40.349 ], 00:11:40.349 "product_name": "Malloc disk", 00:11:40.349 "block_size": 512, 00:11:40.349 "num_blocks": 65536, 00:11:40.349 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:40.349 "assigned_rate_limits": { 00:11:40.349 "rw_ios_per_sec": 0, 00:11:40.349 "rw_mbytes_per_sec": 0, 00:11:40.349 "r_mbytes_per_sec": 0, 00:11:40.349 "w_mbytes_per_sec": 0 00:11:40.349 }, 00:11:40.349 "claimed": true, 00:11:40.349 "claim_type": "exclusive_write", 00:11:40.349 "zoned": false, 00:11:40.349 "supported_io_types": { 00:11:40.349 "read": true, 00:11:40.349 "write": true, 00:11:40.349 "unmap": true, 00:11:40.349 "flush": true, 00:11:40.349 "reset": true, 00:11:40.349 "nvme_admin": false, 00:11:40.349 "nvme_io": false, 00:11:40.349 "nvme_io_md": false, 00:11:40.349 "write_zeroes": true, 00:11:40.349 "zcopy": true, 00:11:40.349 "get_zone_info": false, 00:11:40.349 "zone_management": false, 00:11:40.349 "zone_append": false, 00:11:40.349 "compare": false, 00:11:40.349 "compare_and_write": false, 00:11:40.349 "abort": true, 00:11:40.349 "seek_hole": false, 00:11:40.349 "seek_data": false, 00:11:40.349 "copy": true, 00:11:40.349 "nvme_iov_md": false 00:11:40.349 }, 00:11:40.349 "memory_domains": [ 00:11:40.349 { 00:11:40.349 "dma_device_id": "system", 00:11:40.349 "dma_device_type": 1 00:11:40.349 }, 00:11:40.349 { 00:11:40.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.349 "dma_device_type": 2 00:11:40.349 } 00:11:40.349 ], 00:11:40.349 "driver_specific": {} 00:11:40.349 } 00:11:40.349 ] 
00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.349 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:40.607 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:40.607 "name": "Existed_Raid", 00:11:40.607 "uuid": "5fc5f033-c812-404b-bb34-953f56334889", 00:11:40.607 "strip_size_kb": 64, 00:11:40.607 "state": "online", 00:11:40.607 "raid_level": "raid0", 00:11:40.607 "superblock": false, 00:11:40.607 "num_base_bdevs": 3, 00:11:40.607 "num_base_bdevs_discovered": 3, 00:11:40.607 "num_base_bdevs_operational": 3, 00:11:40.607 "base_bdevs_list": [ 00:11:40.607 { 00:11:40.607 "name": "NewBaseBdev", 00:11:40.607 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:40.607 "is_configured": true, 00:11:40.607 "data_offset": 0, 00:11:40.607 "data_size": 65536 00:11:40.607 }, 00:11:40.607 { 00:11:40.607 "name": "BaseBdev2", 00:11:40.607 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:40.607 "is_configured": true, 00:11:40.607 "data_offset": 0, 00:11:40.607 "data_size": 65536 00:11:40.607 }, 00:11:40.607 { 00:11:40.607 "name": "BaseBdev3", 00:11:40.607 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:40.607 "is_configured": true, 00:11:40.607 "data_offset": 0, 00:11:40.607 "data_size": 65536 00:11:40.607 } 00:11:40.607 ] 00:11:40.607 }' 00:11:40.607 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:40.607 13:34:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:41.172 [2024-07-15 13:34:28.653636] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:41.172 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:41.172 "name": "Existed_Raid", 00:11:41.172 "aliases": [ 00:11:41.172 "5fc5f033-c812-404b-bb34-953f56334889" 00:11:41.172 ], 00:11:41.172 "product_name": "Raid Volume", 00:11:41.172 "block_size": 512, 00:11:41.172 "num_blocks": 196608, 00:11:41.172 "uuid": "5fc5f033-c812-404b-bb34-953f56334889", 00:11:41.172 "assigned_rate_limits": { 00:11:41.172 "rw_ios_per_sec": 0, 00:11:41.172 "rw_mbytes_per_sec": 0, 00:11:41.172 "r_mbytes_per_sec": 0, 00:11:41.172 "w_mbytes_per_sec": 0 00:11:41.172 }, 00:11:41.172 "claimed": false, 00:11:41.172 "zoned": false, 00:11:41.172 "supported_io_types": { 00:11:41.172 "read": true, 00:11:41.172 "write": true, 00:11:41.172 "unmap": true, 00:11:41.172 "flush": true, 00:11:41.172 "reset": true, 00:11:41.172 "nvme_admin": false, 00:11:41.172 "nvme_io": false, 00:11:41.172 "nvme_io_md": false, 00:11:41.172 "write_zeroes": true, 00:11:41.172 "zcopy": false, 00:11:41.172 "get_zone_info": false, 00:11:41.172 "zone_management": false, 00:11:41.172 "zone_append": false, 00:11:41.172 "compare": false, 00:11:41.172 "compare_and_write": false, 00:11:41.172 "abort": false, 00:11:41.172 "seek_hole": false, 00:11:41.172 "seek_data": false, 00:11:41.172 "copy": false, 00:11:41.172 "nvme_iov_md": false 00:11:41.172 }, 00:11:41.172 "memory_domains": [ 00:11:41.172 { 00:11:41.172 "dma_device_id": "system", 00:11:41.172 "dma_device_type": 1 00:11:41.172 }, 00:11:41.172 { 00:11:41.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.172 "dma_device_type": 2 00:11:41.172 }, 00:11:41.172 { 00:11:41.172 "dma_device_id": "system", 00:11:41.172 "dma_device_type": 1 00:11:41.172 }, 00:11:41.172 { 00:11:41.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.172 "dma_device_type": 2 00:11:41.172 }, 00:11:41.172 { 00:11:41.172 "dma_device_id": "system", 00:11:41.172 "dma_device_type": 1 00:11:41.172 }, 00:11:41.172 { 00:11:41.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.172 "dma_device_type": 2 00:11:41.172 } 00:11:41.172 ], 00:11:41.172 "driver_specific": { 00:11:41.172 "raid": { 00:11:41.172 "uuid": "5fc5f033-c812-404b-bb34-953f56334889", 00:11:41.172 "strip_size_kb": 64, 00:11:41.172 "state": "online", 00:11:41.172 "raid_level": "raid0", 00:11:41.172 "superblock": false, 00:11:41.173 "num_base_bdevs": 3, 00:11:41.173 "num_base_bdevs_discovered": 3, 00:11:41.173 "num_base_bdevs_operational": 3, 00:11:41.173 "base_bdevs_list": [ 00:11:41.173 { 00:11:41.173 "name": "NewBaseBdev", 00:11:41.173 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:41.173 "is_configured": true, 00:11:41.173 "data_offset": 0, 00:11:41.173 "data_size": 65536 00:11:41.173 }, 00:11:41.173 { 00:11:41.173 "name": "BaseBdev2", 00:11:41.173 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:41.173 "is_configured": true, 00:11:41.173 "data_offset": 0, 00:11:41.173 "data_size": 65536 00:11:41.173 }, 00:11:41.173 { 00:11:41.173 "name": "BaseBdev3", 00:11:41.173 
"uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:41.173 "is_configured": true, 00:11:41.173 "data_offset": 0, 00:11:41.173 "data_size": 65536 00:11:41.173 } 00:11:41.173 ] 00:11:41.173 } 00:11:41.173 } 00:11:41.173 }' 00:11:41.173 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:41.173 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:41.173 BaseBdev2 00:11:41.173 BaseBdev3' 00:11:41.173 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:41.173 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:41.173 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:41.430 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:41.430 "name": "NewBaseBdev", 00:11:41.430 "aliases": [ 00:11:41.430 "b831ac31-ca21-42be-b5c0-34a4b2b2bc33" 00:11:41.430 ], 00:11:41.430 "product_name": "Malloc disk", 00:11:41.430 "block_size": 512, 00:11:41.430 "num_blocks": 65536, 00:11:41.430 "uuid": "b831ac31-ca21-42be-b5c0-34a4b2b2bc33", 00:11:41.430 "assigned_rate_limits": { 00:11:41.430 "rw_ios_per_sec": 0, 00:11:41.430 "rw_mbytes_per_sec": 0, 00:11:41.430 "r_mbytes_per_sec": 0, 00:11:41.430 "w_mbytes_per_sec": 0 00:11:41.430 }, 00:11:41.430 "claimed": true, 00:11:41.430 "claim_type": "exclusive_write", 00:11:41.430 "zoned": false, 00:11:41.430 "supported_io_types": { 00:11:41.430 "read": true, 00:11:41.430 "write": true, 00:11:41.430 "unmap": true, 00:11:41.430 "flush": true, 00:11:41.430 "reset": true, 00:11:41.430 "nvme_admin": false, 00:11:41.430 "nvme_io": false, 00:11:41.430 "nvme_io_md": false, 00:11:41.430 "write_zeroes": true, 00:11:41.430 "zcopy": true, 00:11:41.430 "get_zone_info": false, 00:11:41.430 "zone_management": false, 00:11:41.430 "zone_append": false, 00:11:41.430 "compare": false, 00:11:41.430 "compare_and_write": false, 00:11:41.430 "abort": true, 00:11:41.430 "seek_hole": false, 00:11:41.430 "seek_data": false, 00:11:41.430 "copy": true, 00:11:41.430 "nvme_iov_md": false 00:11:41.430 }, 00:11:41.430 "memory_domains": [ 00:11:41.430 { 00:11:41.430 "dma_device_id": "system", 00:11:41.430 "dma_device_type": 1 00:11:41.430 }, 00:11:41.430 { 00:11:41.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.430 "dma_device_type": 2 00:11:41.430 } 00:11:41.430 ], 00:11:41.430 "driver_specific": {} 00:11:41.430 }' 00:11:41.430 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.430 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.430 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:41.430 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.430 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.430 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:41.430 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.688 13:34:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:41.688 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:41.946 "name": "BaseBdev2", 00:11:41.946 "aliases": [ 00:11:41.946 "b0e67b51-c713-41ac-856f-3b0ba5630097" 00:11:41.946 ], 00:11:41.946 "product_name": "Malloc disk", 00:11:41.946 "block_size": 512, 00:11:41.946 "num_blocks": 65536, 00:11:41.946 "uuid": "b0e67b51-c713-41ac-856f-3b0ba5630097", 00:11:41.946 "assigned_rate_limits": { 00:11:41.946 "rw_ios_per_sec": 0, 00:11:41.946 "rw_mbytes_per_sec": 0, 00:11:41.946 "r_mbytes_per_sec": 0, 00:11:41.946 "w_mbytes_per_sec": 0 00:11:41.946 }, 00:11:41.946 "claimed": true, 00:11:41.946 "claim_type": "exclusive_write", 00:11:41.946 "zoned": false, 00:11:41.946 "supported_io_types": { 00:11:41.946 "read": true, 00:11:41.946 "write": true, 00:11:41.946 "unmap": true, 00:11:41.946 "flush": true, 00:11:41.946 "reset": true, 00:11:41.946 "nvme_admin": false, 00:11:41.946 "nvme_io": false, 00:11:41.946 "nvme_io_md": false, 00:11:41.946 "write_zeroes": true, 00:11:41.946 "zcopy": true, 00:11:41.946 "get_zone_info": false, 00:11:41.946 "zone_management": false, 00:11:41.946 "zone_append": false, 00:11:41.946 "compare": false, 00:11:41.946 "compare_and_write": false, 00:11:41.946 "abort": true, 00:11:41.946 "seek_hole": false, 00:11:41.946 "seek_data": false, 00:11:41.946 "copy": true, 00:11:41.946 "nvme_iov_md": false 00:11:41.946 }, 00:11:41.946 "memory_domains": [ 00:11:41.946 { 00:11:41.946 "dma_device_id": "system", 00:11:41.946 "dma_device_type": 1 00:11:41.946 }, 00:11:41.946 { 00:11:41.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.946 "dma_device_type": 2 00:11:41.946 } 00:11:41.946 ], 00:11:41.946 "driver_specific": {} 00:11:41.946 }' 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.946 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:42.233 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:42.489 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:42.489 "name": "BaseBdev3", 00:11:42.489 "aliases": [ 00:11:42.489 "4a07c324-fc98-4bcd-92d0-7654c35b5ddd" 00:11:42.489 ], 00:11:42.489 "product_name": "Malloc disk", 00:11:42.489 "block_size": 512, 00:11:42.489 "num_blocks": 65536, 00:11:42.489 "uuid": "4a07c324-fc98-4bcd-92d0-7654c35b5ddd", 00:11:42.489 "assigned_rate_limits": { 00:11:42.489 "rw_ios_per_sec": 0, 00:11:42.489 "rw_mbytes_per_sec": 0, 00:11:42.489 "r_mbytes_per_sec": 0, 00:11:42.489 "w_mbytes_per_sec": 0 00:11:42.489 }, 00:11:42.489 "claimed": true, 00:11:42.489 "claim_type": "exclusive_write", 00:11:42.489 "zoned": false, 00:11:42.489 "supported_io_types": { 00:11:42.489 "read": true, 00:11:42.489 "write": true, 00:11:42.489 "unmap": true, 00:11:42.489 "flush": true, 00:11:42.489 "reset": true, 00:11:42.489 "nvme_admin": false, 00:11:42.489 "nvme_io": false, 00:11:42.489 "nvme_io_md": false, 00:11:42.489 "write_zeroes": true, 00:11:42.490 "zcopy": true, 00:11:42.490 "get_zone_info": false, 00:11:42.490 "zone_management": false, 00:11:42.490 "zone_append": false, 00:11:42.490 "compare": false, 00:11:42.490 "compare_and_write": false, 00:11:42.490 "abort": true, 00:11:42.490 "seek_hole": false, 00:11:42.490 "seek_data": false, 00:11:42.490 "copy": true, 00:11:42.490 "nvme_iov_md": false 00:11:42.490 }, 00:11:42.490 "memory_domains": [ 00:11:42.490 { 00:11:42.490 "dma_device_id": "system", 00:11:42.490 "dma_device_type": 1 00:11:42.490 }, 00:11:42.490 { 00:11:42.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.490 "dma_device_type": 2 00:11:42.490 } 00:11:42.490 ], 00:11:42.490 "driver_specific": {} 00:11:42.490 }' 00:11:42.490 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.490 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.490 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:42.490 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.490 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.490 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:42.490 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.490 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.490 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:42.490 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:42.746 [2024-07-15 13:34:30.297715] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:42.746 [2024-07-15 13:34:30.297741] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:42.746 [2024-07-15 13:34:30.297786] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:42.746 [2024-07-15 13:34:30.297822] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:42.746 [2024-07-15 13:34:30.297830] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2758000 name Existed_Raid, state offline 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4182262 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4182262 ']' 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4182262 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:42.746 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4182262 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4182262' 00:11:43.003 killing process with pid 4182262 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4182262 00:11:43.003 [2024-07-15 13:34:30.366267] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4182262 00:11:43.003 [2024-07-15 13:34:30.392070] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:43.003 00:11:43.003 real 0m21.754s 00:11:43.003 user 0m39.677s 00:11:43.003 sys 0m4.163s 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:43.003 13:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.003 ************************************ 00:11:43.003 END TEST raid_state_function_test 00:11:43.003 ************************************ 00:11:43.003 13:34:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:43.003 13:34:30 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:11:43.003 13:34:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:43.003 13:34:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:43.003 13:34:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:43.261 ************************************ 00:11:43.261 START TEST 
raid_state_function_test_sb 00:11:43.261 ************************************ 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4185705 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4185705' 00:11:43.261 Process raid pid: 4185705 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4185705 /var/tmp/spdk-raid.sock 00:11:43.261 13:34:30 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4185705 ']' 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:43.261 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:43.261 13:34:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:43.261 [2024-07-15 13:34:30.713679] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:11:43.261 [2024-07-15 13:34:30.713727] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:43.261 [2024-07-15 13:34:30.801473] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:43.519 [2024-07-15 13:34:30.892046] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:43.519 [2024-07-15 13:34:30.951095] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:43.519 [2024-07-15 13:34:30.951120] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:44.085 13:34:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:44.085 13:34:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:44.085 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:44.085 [2024-07-15 13:34:31.659058] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:44.085 [2024-07-15 13:34:31.659089] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:44.085 [2024-07-15 13:34:31.659096] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:44.085 [2024-07-15 13:34:31.659119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:44.085 [2024-07-15 13:34:31.659125] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:44.086 [2024-07-15 13:34:31.659132] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.086 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.343 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.343 "name": "Existed_Raid", 00:11:44.343 "uuid": "9e596d0e-0a38-48be-aadb-cb76e28b69e8", 00:11:44.343 "strip_size_kb": 64, 00:11:44.343 "state": "configuring", 00:11:44.343 "raid_level": "raid0", 00:11:44.343 "superblock": true, 00:11:44.343 "num_base_bdevs": 3, 00:11:44.343 "num_base_bdevs_discovered": 0, 00:11:44.343 "num_base_bdevs_operational": 3, 00:11:44.343 "base_bdevs_list": [ 00:11:44.343 { 00:11:44.343 "name": "BaseBdev1", 00:11:44.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.343 "is_configured": false, 00:11:44.343 "data_offset": 0, 00:11:44.343 "data_size": 0 00:11:44.343 }, 00:11:44.343 { 00:11:44.343 "name": "BaseBdev2", 00:11:44.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.343 "is_configured": false, 00:11:44.343 "data_offset": 0, 00:11:44.343 "data_size": 0 00:11:44.343 }, 00:11:44.343 { 00:11:44.343 "name": "BaseBdev3", 00:11:44.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.343 "is_configured": false, 00:11:44.343 "data_offset": 0, 00:11:44.343 "data_size": 0 00:11:44.343 } 00:11:44.343 ] 00:11:44.343 }' 00:11:44.343 13:34:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.343 13:34:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.908 13:34:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:44.908 [2024-07-15 13:34:32.489122] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:44.908 [2024-07-15 13:34:32.489148] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x177ef50 name Existed_Raid, state configuring 00:11:44.908 13:34:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:45.166 [2024-07-15 13:34:32.665594] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:45.166 [2024-07-15 13:34:32.665618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:11:45.166 [2024-07-15 13:34:32.665625] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:45.166 [2024-07-15 13:34:32.665633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:45.166 [2024-07-15 13:34:32.665639] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:45.166 [2024-07-15 13:34:32.665646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:45.166 13:34:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:45.423 [2024-07-15 13:34:32.846547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:45.424 BaseBdev1 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:45.424 13:34:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:45.424 13:34:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:45.682 [ 00:11:45.682 { 00:11:45.682 "name": "BaseBdev1", 00:11:45.682 "aliases": [ 00:11:45.682 "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5" 00:11:45.682 ], 00:11:45.682 "product_name": "Malloc disk", 00:11:45.682 "block_size": 512, 00:11:45.682 "num_blocks": 65536, 00:11:45.682 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:45.682 "assigned_rate_limits": { 00:11:45.682 "rw_ios_per_sec": 0, 00:11:45.682 "rw_mbytes_per_sec": 0, 00:11:45.682 "r_mbytes_per_sec": 0, 00:11:45.682 "w_mbytes_per_sec": 0 00:11:45.682 }, 00:11:45.682 "claimed": true, 00:11:45.682 "claim_type": "exclusive_write", 00:11:45.682 "zoned": false, 00:11:45.682 "supported_io_types": { 00:11:45.682 "read": true, 00:11:45.682 "write": true, 00:11:45.682 "unmap": true, 00:11:45.682 "flush": true, 00:11:45.682 "reset": true, 00:11:45.682 "nvme_admin": false, 00:11:45.682 "nvme_io": false, 00:11:45.682 "nvme_io_md": false, 00:11:45.682 "write_zeroes": true, 00:11:45.682 "zcopy": true, 00:11:45.682 "get_zone_info": false, 00:11:45.682 "zone_management": false, 00:11:45.682 "zone_append": false, 00:11:45.682 "compare": false, 00:11:45.682 "compare_and_write": false, 00:11:45.682 "abort": true, 00:11:45.682 "seek_hole": false, 00:11:45.682 "seek_data": false, 00:11:45.682 "copy": true, 00:11:45.682 "nvme_iov_md": false 00:11:45.682 }, 00:11:45.682 "memory_domains": [ 00:11:45.682 { 00:11:45.682 "dma_device_id": "system", 00:11:45.682 "dma_device_type": 1 00:11:45.682 }, 00:11:45.682 { 00:11:45.682 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:45.682 "dma_device_type": 2 00:11:45.682 } 00:11:45.682 ], 00:11:45.682 "driver_specific": {} 00:11:45.682 } 00:11:45.682 ] 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.682 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.939 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.939 "name": "Existed_Raid", 00:11:45.939 "uuid": "fa2374f0-f309-43ae-bfab-151744bcef2b", 00:11:45.939 "strip_size_kb": 64, 00:11:45.939 "state": "configuring", 00:11:45.939 "raid_level": "raid0", 00:11:45.939 "superblock": true, 00:11:45.939 "num_base_bdevs": 3, 00:11:45.939 "num_base_bdevs_discovered": 1, 00:11:45.939 "num_base_bdevs_operational": 3, 00:11:45.939 "base_bdevs_list": [ 00:11:45.939 { 00:11:45.939 "name": "BaseBdev1", 00:11:45.939 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:45.939 "is_configured": true, 00:11:45.939 "data_offset": 2048, 00:11:45.939 "data_size": 63488 00:11:45.939 }, 00:11:45.939 { 00:11:45.939 "name": "BaseBdev2", 00:11:45.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.939 "is_configured": false, 00:11:45.939 "data_offset": 0, 00:11:45.939 "data_size": 0 00:11:45.939 }, 00:11:45.939 { 00:11:45.939 "name": "BaseBdev3", 00:11:45.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.939 "is_configured": false, 00:11:45.939 "data_offset": 0, 00:11:45.939 "data_size": 0 00:11:45.939 } 00:11:45.939 ] 00:11:45.939 }' 00:11:45.939 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.939 13:34:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.502 13:34:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:46.502 [2024-07-15 13:34:34.021597] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:46.503 
[2024-07-15 13:34:34.021631] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x177e820 name Existed_Raid, state configuring 00:11:46.503 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:46.760 [2024-07-15 13:34:34.198092] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:46.760 [2024-07-15 13:34:34.199177] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:46.760 [2024-07-15 13:34:34.199202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:46.760 [2024-07-15 13:34:34.199209] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:46.760 [2024-07-15 13:34:34.199216] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.760 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.018 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.018 "name": "Existed_Raid", 00:11:47.018 "uuid": "8a2ff595-b1d7-4d41-a6d2-392a7416f06a", 00:11:47.018 "strip_size_kb": 64, 00:11:47.018 "state": "configuring", 00:11:47.018 "raid_level": "raid0", 00:11:47.018 "superblock": true, 00:11:47.018 "num_base_bdevs": 3, 00:11:47.018 "num_base_bdevs_discovered": 1, 00:11:47.018 "num_base_bdevs_operational": 3, 00:11:47.018 "base_bdevs_list": [ 00:11:47.018 { 00:11:47.018 "name": "BaseBdev1", 00:11:47.018 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:47.018 "is_configured": true, 00:11:47.018 "data_offset": 2048, 00:11:47.018 "data_size": 63488 00:11:47.018 }, 00:11:47.018 { 
00:11:47.018 "name": "BaseBdev2", 00:11:47.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.018 "is_configured": false, 00:11:47.018 "data_offset": 0, 00:11:47.018 "data_size": 0 00:11:47.018 }, 00:11:47.018 { 00:11:47.018 "name": "BaseBdev3", 00:11:47.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.018 "is_configured": false, 00:11:47.018 "data_offset": 0, 00:11:47.018 "data_size": 0 00:11:47.018 } 00:11:47.018 ] 00:11:47.018 }' 00:11:47.018 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.018 13:34:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:47.275 13:34:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:47.533 [2024-07-15 13:34:35.047268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:47.533 BaseBdev2 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:47.533 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:47.790 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:47.790 [ 00:11:47.790 { 00:11:47.790 "name": "BaseBdev2", 00:11:47.790 "aliases": [ 00:11:47.790 "a0bd2aeb-4ff7-4895-9529-358aeacdba3b" 00:11:47.790 ], 00:11:47.790 "product_name": "Malloc disk", 00:11:47.790 "block_size": 512, 00:11:47.790 "num_blocks": 65536, 00:11:47.790 "uuid": "a0bd2aeb-4ff7-4895-9529-358aeacdba3b", 00:11:47.790 "assigned_rate_limits": { 00:11:47.790 "rw_ios_per_sec": 0, 00:11:47.790 "rw_mbytes_per_sec": 0, 00:11:47.790 "r_mbytes_per_sec": 0, 00:11:47.790 "w_mbytes_per_sec": 0 00:11:47.790 }, 00:11:47.790 "claimed": true, 00:11:47.790 "claim_type": "exclusive_write", 00:11:47.790 "zoned": false, 00:11:47.790 "supported_io_types": { 00:11:47.790 "read": true, 00:11:47.790 "write": true, 00:11:47.790 "unmap": true, 00:11:47.790 "flush": true, 00:11:47.790 "reset": true, 00:11:47.790 "nvme_admin": false, 00:11:47.790 "nvme_io": false, 00:11:47.790 "nvme_io_md": false, 00:11:47.790 "write_zeroes": true, 00:11:47.790 "zcopy": true, 00:11:47.790 "get_zone_info": false, 00:11:47.790 "zone_management": false, 00:11:47.790 "zone_append": false, 00:11:47.790 "compare": false, 00:11:47.790 "compare_and_write": false, 00:11:47.790 "abort": true, 00:11:47.790 "seek_hole": false, 00:11:47.791 "seek_data": false, 00:11:47.791 "copy": true, 00:11:47.791 "nvme_iov_md": false 00:11:47.791 }, 00:11:47.791 "memory_domains": [ 00:11:47.791 { 00:11:47.791 
"dma_device_id": "system", 00:11:47.791 "dma_device_type": 1 00:11:47.791 }, 00:11:47.791 { 00:11:47.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.791 "dma_device_type": 2 00:11:47.791 } 00:11:47.791 ], 00:11:47.791 "driver_specific": {} 00:11:47.791 } 00:11:47.791 ] 00:11:47.791 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:47.791 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:47.791 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:47.791 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:47.791 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:47.791 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.048 "name": "Existed_Raid", 00:11:48.048 "uuid": "8a2ff595-b1d7-4d41-a6d2-392a7416f06a", 00:11:48.048 "strip_size_kb": 64, 00:11:48.048 "state": "configuring", 00:11:48.048 "raid_level": "raid0", 00:11:48.048 "superblock": true, 00:11:48.048 "num_base_bdevs": 3, 00:11:48.048 "num_base_bdevs_discovered": 2, 00:11:48.048 "num_base_bdevs_operational": 3, 00:11:48.048 "base_bdevs_list": [ 00:11:48.048 { 00:11:48.048 "name": "BaseBdev1", 00:11:48.048 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:48.048 "is_configured": true, 00:11:48.048 "data_offset": 2048, 00:11:48.048 "data_size": 63488 00:11:48.048 }, 00:11:48.048 { 00:11:48.048 "name": "BaseBdev2", 00:11:48.048 "uuid": "a0bd2aeb-4ff7-4895-9529-358aeacdba3b", 00:11:48.048 "is_configured": true, 00:11:48.048 "data_offset": 2048, 00:11:48.048 "data_size": 63488 00:11:48.048 }, 00:11:48.048 { 00:11:48.048 "name": "BaseBdev3", 00:11:48.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.048 "is_configured": false, 00:11:48.048 "data_offset": 0, 00:11:48.048 "data_size": 0 00:11:48.048 } 00:11:48.048 ] 00:11:48.048 }' 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.048 13:34:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:11:48.613 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:48.871 [2024-07-15 13:34:36.237256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:48.871 [2024-07-15 13:34:36.237399] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x177f710 00:11:48.871 [2024-07-15 13:34:36.237412] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:48.871 [2024-07-15 13:34:36.237539] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x177f3e0 00:11:48.871 [2024-07-15 13:34:36.237627] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x177f710 00:11:48.871 [2024-07-15 13:34:36.237634] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x177f710 00:11:48.871 [2024-07-15 13:34:36.237698] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:48.871 BaseBdev3 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:48.871 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:49.128 [ 00:11:49.128 { 00:11:49.128 "name": "BaseBdev3", 00:11:49.128 "aliases": [ 00:11:49.128 "714fceca-071f-4114-888b-9e296aec5a0c" 00:11:49.128 ], 00:11:49.128 "product_name": "Malloc disk", 00:11:49.128 "block_size": 512, 00:11:49.128 "num_blocks": 65536, 00:11:49.128 "uuid": "714fceca-071f-4114-888b-9e296aec5a0c", 00:11:49.128 "assigned_rate_limits": { 00:11:49.129 "rw_ios_per_sec": 0, 00:11:49.129 "rw_mbytes_per_sec": 0, 00:11:49.129 "r_mbytes_per_sec": 0, 00:11:49.129 "w_mbytes_per_sec": 0 00:11:49.129 }, 00:11:49.129 "claimed": true, 00:11:49.129 "claim_type": "exclusive_write", 00:11:49.129 "zoned": false, 00:11:49.129 "supported_io_types": { 00:11:49.129 "read": true, 00:11:49.129 "write": true, 00:11:49.129 "unmap": true, 00:11:49.129 "flush": true, 00:11:49.129 "reset": true, 00:11:49.129 "nvme_admin": false, 00:11:49.129 "nvme_io": false, 00:11:49.129 "nvme_io_md": false, 00:11:49.129 "write_zeroes": true, 00:11:49.129 "zcopy": true, 00:11:49.129 "get_zone_info": false, 00:11:49.129 "zone_management": false, 00:11:49.129 "zone_append": false, 00:11:49.129 "compare": false, 00:11:49.129 "compare_and_write": false, 00:11:49.129 "abort": true, 00:11:49.129 "seek_hole": false, 00:11:49.129 "seek_data": false, 00:11:49.129 "copy": true, 00:11:49.129 "nvme_iov_md": false 
00:11:49.129 }, 00:11:49.129 "memory_domains": [ 00:11:49.129 { 00:11:49.129 "dma_device_id": "system", 00:11:49.129 "dma_device_type": 1 00:11:49.129 }, 00:11:49.129 { 00:11:49.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.129 "dma_device_type": 2 00:11:49.129 } 00:11:49.129 ], 00:11:49.129 "driver_specific": {} 00:11:49.129 } 00:11:49.129 ] 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.129 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.387 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.387 "name": "Existed_Raid", 00:11:49.387 "uuid": "8a2ff595-b1d7-4d41-a6d2-392a7416f06a", 00:11:49.387 "strip_size_kb": 64, 00:11:49.387 "state": "online", 00:11:49.387 "raid_level": "raid0", 00:11:49.387 "superblock": true, 00:11:49.387 "num_base_bdevs": 3, 00:11:49.387 "num_base_bdevs_discovered": 3, 00:11:49.387 "num_base_bdevs_operational": 3, 00:11:49.387 "base_bdevs_list": [ 00:11:49.387 { 00:11:49.387 "name": "BaseBdev1", 00:11:49.387 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:49.387 "is_configured": true, 00:11:49.387 "data_offset": 2048, 00:11:49.387 "data_size": 63488 00:11:49.387 }, 00:11:49.387 { 00:11:49.387 "name": "BaseBdev2", 00:11:49.387 "uuid": "a0bd2aeb-4ff7-4895-9529-358aeacdba3b", 00:11:49.387 "is_configured": true, 00:11:49.387 "data_offset": 2048, 00:11:49.387 "data_size": 63488 00:11:49.387 }, 00:11:49.387 { 00:11:49.387 "name": "BaseBdev3", 00:11:49.387 "uuid": "714fceca-071f-4114-888b-9e296aec5a0c", 00:11:49.387 "is_configured": true, 00:11:49.387 "data_offset": 2048, 00:11:49.387 "data_size": 63488 00:11:49.387 } 00:11:49.387 ] 00:11:49.387 }' 00:11:49.387 13:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.387 13:34:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:49.983 [2024-07-15 13:34:37.432565] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:49.983 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:49.983 "name": "Existed_Raid", 00:11:49.983 "aliases": [ 00:11:49.983 "8a2ff595-b1d7-4d41-a6d2-392a7416f06a" 00:11:49.983 ], 00:11:49.983 "product_name": "Raid Volume", 00:11:49.983 "block_size": 512, 00:11:49.983 "num_blocks": 190464, 00:11:49.983 "uuid": "8a2ff595-b1d7-4d41-a6d2-392a7416f06a", 00:11:49.983 "assigned_rate_limits": { 00:11:49.983 "rw_ios_per_sec": 0, 00:11:49.983 "rw_mbytes_per_sec": 0, 00:11:49.983 "r_mbytes_per_sec": 0, 00:11:49.983 "w_mbytes_per_sec": 0 00:11:49.983 }, 00:11:49.983 "claimed": false, 00:11:49.983 "zoned": false, 00:11:49.983 "supported_io_types": { 00:11:49.983 "read": true, 00:11:49.983 "write": true, 00:11:49.983 "unmap": true, 00:11:49.983 "flush": true, 00:11:49.983 "reset": true, 00:11:49.983 "nvme_admin": false, 00:11:49.983 "nvme_io": false, 00:11:49.983 "nvme_io_md": false, 00:11:49.983 "write_zeroes": true, 00:11:49.983 "zcopy": false, 00:11:49.983 "get_zone_info": false, 00:11:49.983 "zone_management": false, 00:11:49.983 "zone_append": false, 00:11:49.983 "compare": false, 00:11:49.983 "compare_and_write": false, 00:11:49.983 "abort": false, 00:11:49.983 "seek_hole": false, 00:11:49.983 "seek_data": false, 00:11:49.983 "copy": false, 00:11:49.983 "nvme_iov_md": false 00:11:49.983 }, 00:11:49.983 "memory_domains": [ 00:11:49.983 { 00:11:49.983 "dma_device_id": "system", 00:11:49.983 "dma_device_type": 1 00:11:49.983 }, 00:11:49.983 { 00:11:49.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.983 "dma_device_type": 2 00:11:49.983 }, 00:11:49.984 { 00:11:49.984 "dma_device_id": "system", 00:11:49.984 "dma_device_type": 1 00:11:49.984 }, 00:11:49.984 { 00:11:49.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.984 "dma_device_type": 2 00:11:49.984 }, 00:11:49.984 { 00:11:49.984 "dma_device_id": "system", 00:11:49.984 "dma_device_type": 1 00:11:49.984 }, 00:11:49.984 { 00:11:49.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.984 "dma_device_type": 2 00:11:49.984 } 00:11:49.984 ], 00:11:49.984 "driver_specific": { 00:11:49.984 "raid": { 00:11:49.984 "uuid": "8a2ff595-b1d7-4d41-a6d2-392a7416f06a", 00:11:49.984 "strip_size_kb": 64, 00:11:49.984 "state": "online", 00:11:49.984 "raid_level": "raid0", 00:11:49.984 "superblock": true, 00:11:49.984 
"num_base_bdevs": 3, 00:11:49.984 "num_base_bdevs_discovered": 3, 00:11:49.984 "num_base_bdevs_operational": 3, 00:11:49.984 "base_bdevs_list": [ 00:11:49.984 { 00:11:49.984 "name": "BaseBdev1", 00:11:49.984 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:49.984 "is_configured": true, 00:11:49.984 "data_offset": 2048, 00:11:49.984 "data_size": 63488 00:11:49.984 }, 00:11:49.984 { 00:11:49.984 "name": "BaseBdev2", 00:11:49.984 "uuid": "a0bd2aeb-4ff7-4895-9529-358aeacdba3b", 00:11:49.984 "is_configured": true, 00:11:49.984 "data_offset": 2048, 00:11:49.984 "data_size": 63488 00:11:49.984 }, 00:11:49.984 { 00:11:49.984 "name": "BaseBdev3", 00:11:49.984 "uuid": "714fceca-071f-4114-888b-9e296aec5a0c", 00:11:49.984 "is_configured": true, 00:11:49.984 "data_offset": 2048, 00:11:49.984 "data_size": 63488 00:11:49.984 } 00:11:49.984 ] 00:11:49.984 } 00:11:49.984 } 00:11:49.984 }' 00:11:49.984 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:49.984 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:49.984 BaseBdev2 00:11:49.984 BaseBdev3' 00:11:49.984 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:49.984 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:49.984 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:50.241 "name": "BaseBdev1", 00:11:50.241 "aliases": [ 00:11:50.241 "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5" 00:11:50.241 ], 00:11:50.241 "product_name": "Malloc disk", 00:11:50.241 "block_size": 512, 00:11:50.241 "num_blocks": 65536, 00:11:50.241 "uuid": "c8d6112d-221f-4e69-b6b1-53fdadcf9ca5", 00:11:50.241 "assigned_rate_limits": { 00:11:50.241 "rw_ios_per_sec": 0, 00:11:50.241 "rw_mbytes_per_sec": 0, 00:11:50.241 "r_mbytes_per_sec": 0, 00:11:50.241 "w_mbytes_per_sec": 0 00:11:50.241 }, 00:11:50.241 "claimed": true, 00:11:50.241 "claim_type": "exclusive_write", 00:11:50.241 "zoned": false, 00:11:50.241 "supported_io_types": { 00:11:50.241 "read": true, 00:11:50.241 "write": true, 00:11:50.241 "unmap": true, 00:11:50.241 "flush": true, 00:11:50.241 "reset": true, 00:11:50.241 "nvme_admin": false, 00:11:50.241 "nvme_io": false, 00:11:50.241 "nvme_io_md": false, 00:11:50.241 "write_zeroes": true, 00:11:50.241 "zcopy": true, 00:11:50.241 "get_zone_info": false, 00:11:50.241 "zone_management": false, 00:11:50.241 "zone_append": false, 00:11:50.241 "compare": false, 00:11:50.241 "compare_and_write": false, 00:11:50.241 "abort": true, 00:11:50.241 "seek_hole": false, 00:11:50.241 "seek_data": false, 00:11:50.241 "copy": true, 00:11:50.241 "nvme_iov_md": false 00:11:50.241 }, 00:11:50.241 "memory_domains": [ 00:11:50.241 { 00:11:50.241 "dma_device_id": "system", 00:11:50.241 "dma_device_type": 1 00:11:50.241 }, 00:11:50.241 { 00:11:50.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.241 "dma_device_type": 2 00:11:50.241 } 00:11:50.241 ], 00:11:50.241 "driver_specific": {} 00:11:50.241 }' 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:50.241 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:50.498 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:50.755 "name": "BaseBdev2", 00:11:50.755 "aliases": [ 00:11:50.755 "a0bd2aeb-4ff7-4895-9529-358aeacdba3b" 00:11:50.755 ], 00:11:50.755 "product_name": "Malloc disk", 00:11:50.755 "block_size": 512, 00:11:50.755 "num_blocks": 65536, 00:11:50.755 "uuid": "a0bd2aeb-4ff7-4895-9529-358aeacdba3b", 00:11:50.755 "assigned_rate_limits": { 00:11:50.755 "rw_ios_per_sec": 0, 00:11:50.755 "rw_mbytes_per_sec": 0, 00:11:50.755 "r_mbytes_per_sec": 0, 00:11:50.755 "w_mbytes_per_sec": 0 00:11:50.755 }, 00:11:50.755 "claimed": true, 00:11:50.755 "claim_type": "exclusive_write", 00:11:50.755 "zoned": false, 00:11:50.755 "supported_io_types": { 00:11:50.755 "read": true, 00:11:50.755 "write": true, 00:11:50.755 "unmap": true, 00:11:50.755 "flush": true, 00:11:50.755 "reset": true, 00:11:50.755 "nvme_admin": false, 00:11:50.755 "nvme_io": false, 00:11:50.755 "nvme_io_md": false, 00:11:50.755 "write_zeroes": true, 00:11:50.755 "zcopy": true, 00:11:50.755 "get_zone_info": false, 00:11:50.755 "zone_management": false, 00:11:50.755 "zone_append": false, 00:11:50.755 "compare": false, 00:11:50.755 "compare_and_write": false, 00:11:50.755 "abort": true, 00:11:50.755 "seek_hole": false, 00:11:50.755 "seek_data": false, 00:11:50.755 "copy": true, 00:11:50.755 "nvme_iov_md": false 00:11:50.755 }, 00:11:50.755 "memory_domains": [ 00:11:50.755 { 00:11:50.755 "dma_device_id": "system", 00:11:50.755 "dma_device_type": 1 00:11:50.755 }, 00:11:50.755 { 00:11:50.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.755 "dma_device_type": 2 00:11:50.755 } 00:11:50.755 ], 00:11:50.755 "driver_specific": {} 00:11:50.755 }' 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # 
[[ 512 == 512 ]] 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.755 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:51.013 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.270 "name": "BaseBdev3", 00:11:51.270 "aliases": [ 00:11:51.270 "714fceca-071f-4114-888b-9e296aec5a0c" 00:11:51.270 ], 00:11:51.270 "product_name": "Malloc disk", 00:11:51.270 "block_size": 512, 00:11:51.270 "num_blocks": 65536, 00:11:51.270 "uuid": "714fceca-071f-4114-888b-9e296aec5a0c", 00:11:51.270 "assigned_rate_limits": { 00:11:51.270 "rw_ios_per_sec": 0, 00:11:51.270 "rw_mbytes_per_sec": 0, 00:11:51.270 "r_mbytes_per_sec": 0, 00:11:51.270 "w_mbytes_per_sec": 0 00:11:51.270 }, 00:11:51.270 "claimed": true, 00:11:51.270 "claim_type": "exclusive_write", 00:11:51.270 "zoned": false, 00:11:51.270 "supported_io_types": { 00:11:51.270 "read": true, 00:11:51.270 "write": true, 00:11:51.270 "unmap": true, 00:11:51.270 "flush": true, 00:11:51.270 "reset": true, 00:11:51.270 "nvme_admin": false, 00:11:51.270 "nvme_io": false, 00:11:51.270 "nvme_io_md": false, 00:11:51.270 "write_zeroes": true, 00:11:51.270 "zcopy": true, 00:11:51.270 "get_zone_info": false, 00:11:51.270 "zone_management": false, 00:11:51.270 "zone_append": false, 00:11:51.270 "compare": false, 00:11:51.270 "compare_and_write": false, 00:11:51.270 "abort": true, 00:11:51.270 "seek_hole": false, 00:11:51.270 "seek_data": false, 00:11:51.270 "copy": true, 00:11:51.270 "nvme_iov_md": false 00:11:51.270 }, 00:11:51.270 "memory_domains": [ 00:11:51.270 { 00:11:51.270 "dma_device_id": "system", 00:11:51.270 "dma_device_type": 1 00:11:51.270 }, 00:11:51.270 { 00:11:51.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.270 "dma_device_type": 2 00:11:51.270 } 00:11:51.270 ], 00:11:51.270 "driver_specific": {} 00:11:51.270 }' 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.270 
13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.270 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.528 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.528 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.528 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.528 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.528 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:51.528 [2024-07-15 13:34:39.124919] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:51.528 [2024-07-15 13:34:39.124942] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:51.528 [2024-07-15 13:34:39.124971] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:51.528 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:51.785 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:51.785 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:51.785 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.785 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:51.785 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:11:51.785 "name": "Existed_Raid", 00:11:51.785 "uuid": "8a2ff595-b1d7-4d41-a6d2-392a7416f06a", 00:11:51.785 "strip_size_kb": 64, 00:11:51.785 "state": "offline", 00:11:51.785 "raid_level": "raid0", 00:11:51.785 "superblock": true, 00:11:51.785 "num_base_bdevs": 3, 00:11:51.786 "num_base_bdevs_discovered": 2, 00:11:51.786 "num_base_bdevs_operational": 2, 00:11:51.786 "base_bdevs_list": [ 00:11:51.786 { 00:11:51.786 "name": null, 00:11:51.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:51.786 "is_configured": false, 00:11:51.786 "data_offset": 2048, 00:11:51.786 "data_size": 63488 00:11:51.786 }, 00:11:51.786 { 00:11:51.786 "name": "BaseBdev2", 00:11:51.786 "uuid": "a0bd2aeb-4ff7-4895-9529-358aeacdba3b", 00:11:51.786 "is_configured": true, 00:11:51.786 "data_offset": 2048, 00:11:51.786 "data_size": 63488 00:11:51.786 }, 00:11:51.786 { 00:11:51.786 "name": "BaseBdev3", 00:11:51.786 "uuid": "714fceca-071f-4114-888b-9e296aec5a0c", 00:11:51.786 "is_configured": true, 00:11:51.786 "data_offset": 2048, 00:11:51.786 "data_size": 63488 00:11:51.786 } 00:11:51.786 ] 00:11:51.786 }' 00:11:51.786 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.786 13:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:52.350 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:52.608 [2024-07-15 13:34:40.124284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:52.608 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:52.608 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:52.608 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.608 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:52.866 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:52.866 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:52.866 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:53.124 [2024-07-15 13:34:40.487609] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:53.124 [2024-07-15 13:34:40.487641] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x177f710 name Existed_Raid, state offline 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:53.124 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:53.382 BaseBdev2 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:53.382 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:53.640 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:53.640 [ 00:11:53.640 { 00:11:53.640 "name": "BaseBdev2", 00:11:53.640 "aliases": [ 00:11:53.640 "4638703c-7c84-41c8-a2b3-187d5b7bac46" 00:11:53.640 ], 00:11:53.640 "product_name": "Malloc disk", 00:11:53.640 "block_size": 512, 00:11:53.640 "num_blocks": 65536, 00:11:53.640 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:53.640 "assigned_rate_limits": { 00:11:53.640 "rw_ios_per_sec": 0, 00:11:53.640 "rw_mbytes_per_sec": 0, 00:11:53.640 "r_mbytes_per_sec": 0, 00:11:53.640 "w_mbytes_per_sec": 0 00:11:53.640 }, 00:11:53.640 "claimed": false, 00:11:53.640 "zoned": false, 00:11:53.640 "supported_io_types": { 00:11:53.640 "read": true, 00:11:53.640 "write": true, 00:11:53.640 "unmap": true, 00:11:53.640 "flush": true, 00:11:53.640 "reset": true, 00:11:53.640 "nvme_admin": false, 00:11:53.640 "nvme_io": false, 00:11:53.640 "nvme_io_md": false, 00:11:53.640 "write_zeroes": true, 00:11:53.640 "zcopy": true, 00:11:53.640 "get_zone_info": false, 00:11:53.640 "zone_management": false, 00:11:53.640 
"zone_append": false, 00:11:53.640 "compare": false, 00:11:53.640 "compare_and_write": false, 00:11:53.640 "abort": true, 00:11:53.640 "seek_hole": false, 00:11:53.640 "seek_data": false, 00:11:53.640 "copy": true, 00:11:53.640 "nvme_iov_md": false 00:11:53.640 }, 00:11:53.640 "memory_domains": [ 00:11:53.640 { 00:11:53.640 "dma_device_id": "system", 00:11:53.640 "dma_device_type": 1 00:11:53.640 }, 00:11:53.640 { 00:11:53.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.640 "dma_device_type": 2 00:11:53.640 } 00:11:53.640 ], 00:11:53.640 "driver_specific": {} 00:11:53.640 } 00:11:53.640 ] 00:11:53.640 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:53.640 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:53.640 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:53.640 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:53.898 BaseBdev3 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:53.898 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:54.154 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:54.154 [ 00:11:54.154 { 00:11:54.154 "name": "BaseBdev3", 00:11:54.154 "aliases": [ 00:11:54.154 "572a780b-2f79-4911-88ee-e2b5d3542c7b" 00:11:54.154 ], 00:11:54.154 "product_name": "Malloc disk", 00:11:54.154 "block_size": 512, 00:11:54.154 "num_blocks": 65536, 00:11:54.154 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:54.154 "assigned_rate_limits": { 00:11:54.154 "rw_ios_per_sec": 0, 00:11:54.154 "rw_mbytes_per_sec": 0, 00:11:54.154 "r_mbytes_per_sec": 0, 00:11:54.154 "w_mbytes_per_sec": 0 00:11:54.154 }, 00:11:54.154 "claimed": false, 00:11:54.154 "zoned": false, 00:11:54.154 "supported_io_types": { 00:11:54.154 "read": true, 00:11:54.154 "write": true, 00:11:54.154 "unmap": true, 00:11:54.154 "flush": true, 00:11:54.154 "reset": true, 00:11:54.154 "nvme_admin": false, 00:11:54.154 "nvme_io": false, 00:11:54.154 "nvme_io_md": false, 00:11:54.154 "write_zeroes": true, 00:11:54.154 "zcopy": true, 00:11:54.154 "get_zone_info": false, 00:11:54.154 "zone_management": false, 00:11:54.154 "zone_append": false, 00:11:54.154 "compare": false, 00:11:54.154 "compare_and_write": false, 00:11:54.154 "abort": true, 00:11:54.154 "seek_hole": false, 00:11:54.154 "seek_data": false, 00:11:54.154 "copy": true, 00:11:54.154 "nvme_iov_md": false 
00:11:54.154 }, 00:11:54.154 "memory_domains": [ 00:11:54.154 { 00:11:54.154 "dma_device_id": "system", 00:11:54.154 "dma_device_type": 1 00:11:54.154 }, 00:11:54.154 { 00:11:54.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.154 "dma_device_type": 2 00:11:54.154 } 00:11:54.154 ], 00:11:54.154 "driver_specific": {} 00:11:54.154 } 00:11:54.154 ] 00:11:54.154 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:54.154 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:54.154 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:54.154 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:54.410 [2024-07-15 13:34:41.874588] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:54.410 [2024-07-15 13:34:41.874622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:54.410 [2024-07-15 13:34:41.874636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:54.410 [2024-07-15 13:34:41.875657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.410 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.667 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.667 "name": "Existed_Raid", 00:11:54.667 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:11:54.667 "strip_size_kb": 64, 00:11:54.667 "state": "configuring", 00:11:54.667 "raid_level": "raid0", 00:11:54.667 "superblock": true, 00:11:54.667 "num_base_bdevs": 3, 00:11:54.667 "num_base_bdevs_discovered": 2, 00:11:54.667 "num_base_bdevs_operational": 3, 00:11:54.667 "base_bdevs_list": [ 00:11:54.667 { 00:11:54.667 "name": "BaseBdev1", 00:11:54.667 
"uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.667 "is_configured": false, 00:11:54.667 "data_offset": 0, 00:11:54.667 "data_size": 0 00:11:54.667 }, 00:11:54.667 { 00:11:54.667 "name": "BaseBdev2", 00:11:54.667 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:54.667 "is_configured": true, 00:11:54.667 "data_offset": 2048, 00:11:54.667 "data_size": 63488 00:11:54.667 }, 00:11:54.667 { 00:11:54.667 "name": "BaseBdev3", 00:11:54.667 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:54.667 "is_configured": true, 00:11:54.667 "data_offset": 2048, 00:11:54.667 "data_size": 63488 00:11:54.667 } 00:11:54.667 ] 00:11:54.667 }' 00:11:54.667 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.667 13:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.923 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:55.180 [2024-07-15 13:34:42.676619] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.180 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.437 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.437 "name": "Existed_Raid", 00:11:55.437 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:11:55.437 "strip_size_kb": 64, 00:11:55.437 "state": "configuring", 00:11:55.437 "raid_level": "raid0", 00:11:55.437 "superblock": true, 00:11:55.437 "num_base_bdevs": 3, 00:11:55.437 "num_base_bdevs_discovered": 1, 00:11:55.437 "num_base_bdevs_operational": 3, 00:11:55.437 "base_bdevs_list": [ 00:11:55.437 { 00:11:55.437 "name": "BaseBdev1", 00:11:55.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.437 "is_configured": false, 00:11:55.437 "data_offset": 0, 00:11:55.437 "data_size": 0 00:11:55.437 }, 00:11:55.437 { 00:11:55.437 "name": null, 00:11:55.437 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:55.437 
"is_configured": false, 00:11:55.437 "data_offset": 2048, 00:11:55.437 "data_size": 63488 00:11:55.437 }, 00:11:55.437 { 00:11:55.437 "name": "BaseBdev3", 00:11:55.437 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:55.437 "is_configured": true, 00:11:55.437 "data_offset": 2048, 00:11:55.437 "data_size": 63488 00:11:55.437 } 00:11:55.437 ] 00:11:55.437 }' 00:11:55.437 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.437 13:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.001 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.001 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:56.001 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:56.001 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:56.258 [2024-07-15 13:34:43.699081] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:56.258 BaseBdev1 00:11:56.258 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:56.258 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:56.258 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:56.258 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:56.258 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:56.258 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:56.259 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:56.516 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:56.516 [ 00:11:56.516 { 00:11:56.516 "name": "BaseBdev1", 00:11:56.516 "aliases": [ 00:11:56.516 "d53aa327-593f-4c9d-be98-db5ee4ae2fe3" 00:11:56.516 ], 00:11:56.516 "product_name": "Malloc disk", 00:11:56.516 "block_size": 512, 00:11:56.516 "num_blocks": 65536, 00:11:56.516 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:11:56.516 "assigned_rate_limits": { 00:11:56.516 "rw_ios_per_sec": 0, 00:11:56.516 "rw_mbytes_per_sec": 0, 00:11:56.516 "r_mbytes_per_sec": 0, 00:11:56.516 "w_mbytes_per_sec": 0 00:11:56.516 }, 00:11:56.516 "claimed": true, 00:11:56.516 "claim_type": "exclusive_write", 00:11:56.516 "zoned": false, 00:11:56.516 "supported_io_types": { 00:11:56.516 "read": true, 00:11:56.516 "write": true, 00:11:56.516 "unmap": true, 00:11:56.516 "flush": true, 00:11:56.516 "reset": true, 00:11:56.516 "nvme_admin": false, 00:11:56.516 "nvme_io": false, 00:11:56.516 "nvme_io_md": false, 00:11:56.516 "write_zeroes": true, 00:11:56.516 "zcopy": true, 00:11:56.516 "get_zone_info": false, 00:11:56.516 "zone_management": 
false, 00:11:56.516 "zone_append": false, 00:11:56.516 "compare": false, 00:11:56.516 "compare_and_write": false, 00:11:56.516 "abort": true, 00:11:56.516 "seek_hole": false, 00:11:56.516 "seek_data": false, 00:11:56.516 "copy": true, 00:11:56.516 "nvme_iov_md": false 00:11:56.516 }, 00:11:56.516 "memory_domains": [ 00:11:56.516 { 00:11:56.516 "dma_device_id": "system", 00:11:56.516 "dma_device_type": 1 00:11:56.516 }, 00:11:56.516 { 00:11:56.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.516 "dma_device_type": 2 00:11:56.516 } 00:11:56.516 ], 00:11:56.516 "driver_specific": {} 00:11:56.516 } 00:11:56.516 ] 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.516 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:56.773 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:56.774 "name": "Existed_Raid", 00:11:56.774 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:11:56.774 "strip_size_kb": 64, 00:11:56.774 "state": "configuring", 00:11:56.774 "raid_level": "raid0", 00:11:56.774 "superblock": true, 00:11:56.774 "num_base_bdevs": 3, 00:11:56.774 "num_base_bdevs_discovered": 2, 00:11:56.774 "num_base_bdevs_operational": 3, 00:11:56.774 "base_bdevs_list": [ 00:11:56.774 { 00:11:56.774 "name": "BaseBdev1", 00:11:56.774 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:11:56.774 "is_configured": true, 00:11:56.774 "data_offset": 2048, 00:11:56.774 "data_size": 63488 00:11:56.774 }, 00:11:56.774 { 00:11:56.774 "name": null, 00:11:56.774 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:56.774 "is_configured": false, 00:11:56.774 "data_offset": 2048, 00:11:56.774 "data_size": 63488 00:11:56.774 }, 00:11:56.774 { 00:11:56.774 "name": "BaseBdev3", 00:11:56.774 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:56.774 "is_configured": true, 00:11:56.774 "data_offset": 2048, 00:11:56.774 "data_size": 63488 00:11:56.774 } 00:11:56.774 ] 00:11:56.774 }' 00:11:56.774 13:34:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:56.774 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:57.338 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.338 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:57.338 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:57.338 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:57.596 [2024-07-15 13:34:45.062644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.596 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.853 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.853 "name": "Existed_Raid", 00:11:57.853 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:11:57.853 "strip_size_kb": 64, 00:11:57.853 "state": "configuring", 00:11:57.853 "raid_level": "raid0", 00:11:57.853 "superblock": true, 00:11:57.853 "num_base_bdevs": 3, 00:11:57.853 "num_base_bdevs_discovered": 1, 00:11:57.853 "num_base_bdevs_operational": 3, 00:11:57.853 "base_bdevs_list": [ 00:11:57.853 { 00:11:57.853 "name": "BaseBdev1", 00:11:57.853 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:11:57.853 "is_configured": true, 00:11:57.853 "data_offset": 2048, 00:11:57.853 "data_size": 63488 00:11:57.853 }, 00:11:57.853 { 00:11:57.853 "name": null, 00:11:57.853 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:57.853 "is_configured": false, 00:11:57.853 "data_offset": 2048, 00:11:57.853 "data_size": 63488 00:11:57.853 }, 00:11:57.853 { 00:11:57.853 "name": null, 00:11:57.853 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:57.853 "is_configured": false, 
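The surrounding trace checks every reconfiguration the same way: after a bdev_raid_remove_base_bdev or bdev_raid_add_base_bdev call it fetches bdev_raid_get_bdevs all, selects the named raid with jq, and asserts on the state field and the per-slot is_configured flags. A condensed sketch of that check, assuming the private RPC socket used by this run; the raid name and slot index mirror the trace:

  # All RPCs in this test go through the per-test socket.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Drop one base bdev out of the raid under construction.
  $RPC bdev_raid_remove_base_bdev BaseBdev3

  # The raid should stay in the "configuring" state and report slot 2 as unconfigured.
  state=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state')
  slot=$($RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured')
  [[ $state == configuring && $slot == false ]] || echo "unexpected raid state: $state / $slot" >&2

In the dumps above and below, a removed slot keeps its uuid but reports "name": null and "is_configured": false until a base bdev is added back into that slot.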
00:11:57.853 "data_offset": 2048, 00:11:57.853 "data_size": 63488 00:11:57.853 } 00:11:57.853 ] 00:11:57.853 }' 00:11:57.853 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.853 13:34:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:58.482 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.482 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:58.482 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:58.482 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:58.741 [2024-07-15 13:34:46.117375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.741 "name": "Existed_Raid", 00:11:58.741 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:11:58.741 "strip_size_kb": 64, 00:11:58.741 "state": "configuring", 00:11:58.741 "raid_level": "raid0", 00:11:58.741 "superblock": true, 00:11:58.741 "num_base_bdevs": 3, 00:11:58.741 "num_base_bdevs_discovered": 2, 00:11:58.741 "num_base_bdevs_operational": 3, 00:11:58.741 "base_bdevs_list": [ 00:11:58.741 { 00:11:58.741 "name": "BaseBdev1", 00:11:58.741 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:11:58.741 "is_configured": true, 00:11:58.741 "data_offset": 2048, 00:11:58.741 "data_size": 63488 00:11:58.741 }, 00:11:58.741 { 00:11:58.741 "name": null, 00:11:58.741 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:58.741 "is_configured": false, 00:11:58.741 "data_offset": 
2048, 00:11:58.741 "data_size": 63488 00:11:58.741 }, 00:11:58.741 { 00:11:58.741 "name": "BaseBdev3", 00:11:58.741 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:58.741 "is_configured": true, 00:11:58.741 "data_offset": 2048, 00:11:58.741 "data_size": 63488 00:11:58.741 } 00:11:58.741 ] 00:11:58.741 }' 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.741 13:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:59.307 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:59.307 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.566 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:59.566 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:59.566 [2024-07-15 13:34:47.111951] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.566 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:59.824 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.824 "name": "Existed_Raid", 00:11:59.824 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:11:59.824 "strip_size_kb": 64, 00:11:59.824 "state": "configuring", 00:11:59.824 "raid_level": "raid0", 00:11:59.824 "superblock": true, 00:11:59.824 "num_base_bdevs": 3, 00:11:59.824 "num_base_bdevs_discovered": 1, 00:11:59.824 "num_base_bdevs_operational": 3, 00:11:59.824 "base_bdevs_list": [ 00:11:59.824 { 00:11:59.824 "name": null, 00:11:59.824 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:11:59.824 "is_configured": false, 00:11:59.824 "data_offset": 2048, 00:11:59.824 "data_size": 63488 00:11:59.824 }, 00:11:59.824 
{ 00:11:59.824 "name": null, 00:11:59.824 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:11:59.824 "is_configured": false, 00:11:59.824 "data_offset": 2048, 00:11:59.824 "data_size": 63488 00:11:59.824 }, 00:11:59.824 { 00:11:59.824 "name": "BaseBdev3", 00:11:59.824 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:11:59.824 "is_configured": true, 00:11:59.824 "data_offset": 2048, 00:11:59.824 "data_size": 63488 00:11:59.824 } 00:11:59.825 ] 00:11:59.825 }' 00:11:59.825 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.825 13:34:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:00.391 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.391 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:00.391 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:00.391 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:00.650 [2024-07-15 13:34:48.134468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.650 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.909 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.909 "name": "Existed_Raid", 00:12:00.909 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:12:00.909 "strip_size_kb": 64, 00:12:00.909 "state": "configuring", 00:12:00.909 "raid_level": "raid0", 00:12:00.909 "superblock": true, 00:12:00.909 "num_base_bdevs": 3, 00:12:00.909 "num_base_bdevs_discovered": 2, 00:12:00.909 "num_base_bdevs_operational": 3, 00:12:00.909 "base_bdevs_list": [ 00:12:00.909 { 00:12:00.909 "name": 
null, 00:12:00.909 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:12:00.909 "is_configured": false, 00:12:00.909 "data_offset": 2048, 00:12:00.909 "data_size": 63488 00:12:00.909 }, 00:12:00.909 { 00:12:00.909 "name": "BaseBdev2", 00:12:00.909 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:12:00.909 "is_configured": true, 00:12:00.909 "data_offset": 2048, 00:12:00.909 "data_size": 63488 00:12:00.909 }, 00:12:00.909 { 00:12:00.909 "name": "BaseBdev3", 00:12:00.909 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:12:00.909 "is_configured": true, 00:12:00.909 "data_offset": 2048, 00:12:00.909 "data_size": 63488 00:12:00.909 } 00:12:00.909 ] 00:12:00.909 }' 00:12:00.909 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.909 13:34:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:01.476 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.476 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:01.476 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:01.476 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:01.476 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.735 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d53aa327-593f-4c9d-be98-db5ee4ae2fe3 00:12:01.735 [2024-07-15 13:34:49.344444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:01.735 [2024-07-15 13:34:49.344559] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x177dc50 00:12:01.735 [2024-07-15 13:34:49.344568] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:01.735 [2024-07-15 13:34:49.344681] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x192c4d0 00:12:01.735 [2024-07-15 13:34:49.344756] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x177dc50 00:12:01.735 [2024-07-15 13:34:49.344763] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x177dc50 00:12:01.735 [2024-07-15 13:34:49.344823] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.735 NewBaseBdev 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:01.993 13:34:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:01.993 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:02.251 [ 00:12:02.251 { 00:12:02.251 "name": "NewBaseBdev", 00:12:02.251 "aliases": [ 00:12:02.251 "d53aa327-593f-4c9d-be98-db5ee4ae2fe3" 00:12:02.251 ], 00:12:02.251 "product_name": "Malloc disk", 00:12:02.251 "block_size": 512, 00:12:02.251 "num_blocks": 65536, 00:12:02.251 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:12:02.251 "assigned_rate_limits": { 00:12:02.251 "rw_ios_per_sec": 0, 00:12:02.251 "rw_mbytes_per_sec": 0, 00:12:02.251 "r_mbytes_per_sec": 0, 00:12:02.251 "w_mbytes_per_sec": 0 00:12:02.251 }, 00:12:02.251 "claimed": true, 00:12:02.251 "claim_type": "exclusive_write", 00:12:02.251 "zoned": false, 00:12:02.251 "supported_io_types": { 00:12:02.251 "read": true, 00:12:02.251 "write": true, 00:12:02.251 "unmap": true, 00:12:02.251 "flush": true, 00:12:02.251 "reset": true, 00:12:02.251 "nvme_admin": false, 00:12:02.251 "nvme_io": false, 00:12:02.251 "nvme_io_md": false, 00:12:02.251 "write_zeroes": true, 00:12:02.251 "zcopy": true, 00:12:02.251 "get_zone_info": false, 00:12:02.251 "zone_management": false, 00:12:02.251 "zone_append": false, 00:12:02.251 "compare": false, 00:12:02.251 "compare_and_write": false, 00:12:02.251 "abort": true, 00:12:02.251 "seek_hole": false, 00:12:02.251 "seek_data": false, 00:12:02.251 "copy": true, 00:12:02.251 "nvme_iov_md": false 00:12:02.251 }, 00:12:02.251 "memory_domains": [ 00:12:02.251 { 00:12:02.251 "dma_device_id": "system", 00:12:02.251 "dma_device_type": 1 00:12:02.251 }, 00:12:02.251 { 00:12:02.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.251 "dma_device_type": 2 00:12:02.251 } 00:12:02.251 ], 00:12:02.251 "driver_specific": {} 00:12:02.251 } 00:12:02.251 ] 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:02.251 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.509 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.509 "name": "Existed_Raid", 00:12:02.509 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:12:02.509 "strip_size_kb": 64, 00:12:02.509 "state": "online", 00:12:02.509 "raid_level": "raid0", 00:12:02.509 "superblock": true, 00:12:02.509 "num_base_bdevs": 3, 00:12:02.509 "num_base_bdevs_discovered": 3, 00:12:02.509 "num_base_bdevs_operational": 3, 00:12:02.509 "base_bdevs_list": [ 00:12:02.509 { 00:12:02.509 "name": "NewBaseBdev", 00:12:02.509 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:12:02.509 "is_configured": true, 00:12:02.509 "data_offset": 2048, 00:12:02.509 "data_size": 63488 00:12:02.509 }, 00:12:02.509 { 00:12:02.509 "name": "BaseBdev2", 00:12:02.509 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:12:02.509 "is_configured": true, 00:12:02.509 "data_offset": 2048, 00:12:02.509 "data_size": 63488 00:12:02.509 }, 00:12:02.509 { 00:12:02.509 "name": "BaseBdev3", 00:12:02.509 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:12:02.509 "is_configured": true, 00:12:02.509 "data_offset": 2048, 00:12:02.509 "data_size": 63488 00:12:02.509 } 00:12:02.509 ] 00:12:02.509 }' 00:12:02.509 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.509 13:34:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:03.075 [2024-07-15 13:34:50.551775] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.075 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:03.075 "name": "Existed_Raid", 00:12:03.075 "aliases": [ 00:12:03.075 "f3bbf0e4-ba03-4e52-90bc-40273411c78c" 00:12:03.075 ], 00:12:03.075 "product_name": "Raid Volume", 00:12:03.075 "block_size": 512, 00:12:03.075 "num_blocks": 190464, 00:12:03.075 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:12:03.075 "assigned_rate_limits": { 00:12:03.075 "rw_ios_per_sec": 0, 00:12:03.075 "rw_mbytes_per_sec": 0, 00:12:03.075 "r_mbytes_per_sec": 0, 00:12:03.075 "w_mbytes_per_sec": 0 00:12:03.075 }, 00:12:03.075 "claimed": false, 00:12:03.075 "zoned": false, 00:12:03.075 "supported_io_types": { 00:12:03.075 "read": true, 00:12:03.075 "write": true, 00:12:03.075 "unmap": true, 00:12:03.075 "flush": true, 00:12:03.075 "reset": true, 
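Once NewBaseBdev (recreated with BaseBdev1's original uuid) is claimed, the raid transitions to "online" and the trace switches from state checks to property checks: verify_raid_bdev_properties dumps the raid with bdev_get_bdevs -b Existed_Raid, pulls the configured base bdev names out of driver_specific.raid.base_bdevs_list, and compares per-bdev fields such as block_size, md_size, md_interleave and dif_type. A condensed sketch of that comparison, assuming the same RPC socket; the block_size check stands in for the full set of fields the test inspects:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Full bdev view of the assembled raid (the raid-specific view comes from
  # bdev_raid_get_bdevs instead).
  raid_json=$($RPC bdev_get_bdevs -b Existed_Raid | jq '.[]')
  raid_bs=$(echo "$raid_json" | jq '.block_size')

  # Every configured base bdev should report the same block size as the volume.
  for name in $(echo "$raid_json" | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'); do
    base_bs=$($RPC bdev_get_bdevs -b "$name" | jq '.[].block_size')
    [[ $raid_bs == "$base_bs" ]] || echo "$name: block_size $base_bs != $raid_bs" >&2
  done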
00:12:03.075 "nvme_admin": false, 00:12:03.075 "nvme_io": false, 00:12:03.075 "nvme_io_md": false, 00:12:03.075 "write_zeroes": true, 00:12:03.075 "zcopy": false, 00:12:03.075 "get_zone_info": false, 00:12:03.075 "zone_management": false, 00:12:03.075 "zone_append": false, 00:12:03.075 "compare": false, 00:12:03.075 "compare_and_write": false, 00:12:03.075 "abort": false, 00:12:03.075 "seek_hole": false, 00:12:03.075 "seek_data": false, 00:12:03.075 "copy": false, 00:12:03.075 "nvme_iov_md": false 00:12:03.075 }, 00:12:03.075 "memory_domains": [ 00:12:03.075 { 00:12:03.075 "dma_device_id": "system", 00:12:03.075 "dma_device_type": 1 00:12:03.075 }, 00:12:03.075 { 00:12:03.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.075 "dma_device_type": 2 00:12:03.075 }, 00:12:03.075 { 00:12:03.075 "dma_device_id": "system", 00:12:03.075 "dma_device_type": 1 00:12:03.075 }, 00:12:03.075 { 00:12:03.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.075 "dma_device_type": 2 00:12:03.075 }, 00:12:03.075 { 00:12:03.075 "dma_device_id": "system", 00:12:03.075 "dma_device_type": 1 00:12:03.075 }, 00:12:03.075 { 00:12:03.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.075 "dma_device_type": 2 00:12:03.075 } 00:12:03.075 ], 00:12:03.075 "driver_specific": { 00:12:03.075 "raid": { 00:12:03.075 "uuid": "f3bbf0e4-ba03-4e52-90bc-40273411c78c", 00:12:03.075 "strip_size_kb": 64, 00:12:03.075 "state": "online", 00:12:03.075 "raid_level": "raid0", 00:12:03.075 "superblock": true, 00:12:03.075 "num_base_bdevs": 3, 00:12:03.075 "num_base_bdevs_discovered": 3, 00:12:03.075 "num_base_bdevs_operational": 3, 00:12:03.075 "base_bdevs_list": [ 00:12:03.075 { 00:12:03.075 "name": "NewBaseBdev", 00:12:03.075 "uuid": "d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:12:03.075 "is_configured": true, 00:12:03.075 "data_offset": 2048, 00:12:03.075 "data_size": 63488 00:12:03.075 }, 00:12:03.075 { 00:12:03.075 "name": "BaseBdev2", 00:12:03.075 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:12:03.075 "is_configured": true, 00:12:03.076 "data_offset": 2048, 00:12:03.076 "data_size": 63488 00:12:03.076 }, 00:12:03.076 { 00:12:03.076 "name": "BaseBdev3", 00:12:03.076 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:12:03.076 "is_configured": true, 00:12:03.076 "data_offset": 2048, 00:12:03.076 "data_size": 63488 00:12:03.076 } 00:12:03.076 ] 00:12:03.076 } 00:12:03.076 } 00:12:03.076 }' 00:12:03.076 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.076 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:03.076 BaseBdev2 00:12:03.076 BaseBdev3' 00:12:03.076 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.076 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:03.076 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.334 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.334 "name": "NewBaseBdev", 00:12:03.334 "aliases": [ 00:12:03.334 "d53aa327-593f-4c9d-be98-db5ee4ae2fe3" 00:12:03.334 ], 00:12:03.334 "product_name": "Malloc disk", 00:12:03.334 "block_size": 512, 00:12:03.334 "num_blocks": 65536, 00:12:03.334 "uuid": 
"d53aa327-593f-4c9d-be98-db5ee4ae2fe3", 00:12:03.334 "assigned_rate_limits": { 00:12:03.334 "rw_ios_per_sec": 0, 00:12:03.334 "rw_mbytes_per_sec": 0, 00:12:03.334 "r_mbytes_per_sec": 0, 00:12:03.334 "w_mbytes_per_sec": 0 00:12:03.334 }, 00:12:03.334 "claimed": true, 00:12:03.334 "claim_type": "exclusive_write", 00:12:03.334 "zoned": false, 00:12:03.334 "supported_io_types": { 00:12:03.334 "read": true, 00:12:03.334 "write": true, 00:12:03.334 "unmap": true, 00:12:03.334 "flush": true, 00:12:03.334 "reset": true, 00:12:03.334 "nvme_admin": false, 00:12:03.334 "nvme_io": false, 00:12:03.334 "nvme_io_md": false, 00:12:03.334 "write_zeroes": true, 00:12:03.334 "zcopy": true, 00:12:03.334 "get_zone_info": false, 00:12:03.334 "zone_management": false, 00:12:03.334 "zone_append": false, 00:12:03.334 "compare": false, 00:12:03.334 "compare_and_write": false, 00:12:03.334 "abort": true, 00:12:03.334 "seek_hole": false, 00:12:03.334 "seek_data": false, 00:12:03.334 "copy": true, 00:12:03.334 "nvme_iov_md": false 00:12:03.334 }, 00:12:03.334 "memory_domains": [ 00:12:03.334 { 00:12:03.334 "dma_device_id": "system", 00:12:03.334 "dma_device_type": 1 00:12:03.334 }, 00:12:03.334 { 00:12:03.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.334 "dma_device_type": 2 00:12:03.334 } 00:12:03.334 ], 00:12:03.334 "driver_specific": {} 00:12:03.334 }' 00:12:03.334 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.334 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.334 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.334 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.334 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.593 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.593 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:03.593 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.850 "name": "BaseBdev2", 00:12:03.850 "aliases": [ 00:12:03.850 "4638703c-7c84-41c8-a2b3-187d5b7bac46" 00:12:03.850 ], 00:12:03.850 "product_name": "Malloc disk", 00:12:03.850 "block_size": 512, 00:12:03.850 "num_blocks": 65536, 00:12:03.850 "uuid": "4638703c-7c84-41c8-a2b3-187d5b7bac46", 00:12:03.850 "assigned_rate_limits": { 00:12:03.850 "rw_ios_per_sec": 0, 00:12:03.850 
"rw_mbytes_per_sec": 0, 00:12:03.850 "r_mbytes_per_sec": 0, 00:12:03.850 "w_mbytes_per_sec": 0 00:12:03.850 }, 00:12:03.850 "claimed": true, 00:12:03.850 "claim_type": "exclusive_write", 00:12:03.850 "zoned": false, 00:12:03.850 "supported_io_types": { 00:12:03.850 "read": true, 00:12:03.850 "write": true, 00:12:03.850 "unmap": true, 00:12:03.850 "flush": true, 00:12:03.850 "reset": true, 00:12:03.850 "nvme_admin": false, 00:12:03.850 "nvme_io": false, 00:12:03.850 "nvme_io_md": false, 00:12:03.850 "write_zeroes": true, 00:12:03.850 "zcopy": true, 00:12:03.850 "get_zone_info": false, 00:12:03.850 "zone_management": false, 00:12:03.850 "zone_append": false, 00:12:03.850 "compare": false, 00:12:03.850 "compare_and_write": false, 00:12:03.850 "abort": true, 00:12:03.850 "seek_hole": false, 00:12:03.850 "seek_data": false, 00:12:03.850 "copy": true, 00:12:03.850 "nvme_iov_md": false 00:12:03.850 }, 00:12:03.850 "memory_domains": [ 00:12:03.850 { 00:12:03.850 "dma_device_id": "system", 00:12:03.850 "dma_device_type": 1 00:12:03.850 }, 00:12:03.850 { 00:12:03.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.850 "dma_device_type": 2 00:12:03.850 } 00:12:03.850 ], 00:12:03.850 "driver_specific": {} 00:12:03.850 }' 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.850 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.107 "name": "BaseBdev3", 00:12:04.107 "aliases": [ 00:12:04.107 "572a780b-2f79-4911-88ee-e2b5d3542c7b" 00:12:04.107 ], 00:12:04.107 "product_name": "Malloc disk", 00:12:04.107 "block_size": 512, 00:12:04.107 "num_blocks": 65536, 00:12:04.107 "uuid": "572a780b-2f79-4911-88ee-e2b5d3542c7b", 00:12:04.107 "assigned_rate_limits": { 00:12:04.107 "rw_ios_per_sec": 0, 00:12:04.107 "rw_mbytes_per_sec": 0, 00:12:04.107 "r_mbytes_per_sec": 0, 00:12:04.107 "w_mbytes_per_sec": 0 00:12:04.107 }, 00:12:04.107 
"claimed": true, 00:12:04.107 "claim_type": "exclusive_write", 00:12:04.107 "zoned": false, 00:12:04.107 "supported_io_types": { 00:12:04.107 "read": true, 00:12:04.107 "write": true, 00:12:04.107 "unmap": true, 00:12:04.107 "flush": true, 00:12:04.107 "reset": true, 00:12:04.107 "nvme_admin": false, 00:12:04.107 "nvme_io": false, 00:12:04.107 "nvme_io_md": false, 00:12:04.107 "write_zeroes": true, 00:12:04.107 "zcopy": true, 00:12:04.107 "get_zone_info": false, 00:12:04.107 "zone_management": false, 00:12:04.107 "zone_append": false, 00:12:04.107 "compare": false, 00:12:04.107 "compare_and_write": false, 00:12:04.107 "abort": true, 00:12:04.107 "seek_hole": false, 00:12:04.107 "seek_data": false, 00:12:04.107 "copy": true, 00:12:04.107 "nvme_iov_md": false 00:12:04.107 }, 00:12:04.107 "memory_domains": [ 00:12:04.107 { 00:12:04.107 "dma_device_id": "system", 00:12:04.107 "dma_device_type": 1 00:12:04.107 }, 00:12:04.107 { 00:12:04.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.107 "dma_device_type": 2 00:12:04.107 } 00:12:04.107 ], 00:12:04.107 "driver_specific": {} 00:12:04.107 }' 00:12:04.107 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.365 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.622 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:04.622 [2024-07-15 13:34:52.203876] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:04.622 [2024-07-15 13:34:52.203899] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:04.622 [2024-07-15 13:34:52.203939] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:04.622 [2024-07-15 13:34:52.203977] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:04.622 [2024-07-15 13:34:52.203985] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x177dc50 name Existed_Raid, state offline 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4185705 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4185705 ']' 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 4185705 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:04.622 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4185705 00:12:04.879 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:04.879 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:04.879 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4185705' 00:12:04.879 killing process with pid 4185705 00:12:04.879 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4185705 00:12:04.879 [2024-07-15 13:34:52.268273] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:04.879 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4185705 00:12:04.879 [2024-07-15 13:34:52.298113] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:05.137 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:05.137 00:12:05.137 real 0m21.847s 00:12:05.137 user 0m39.828s 00:12:05.137 sys 0m4.228s 00:12:05.137 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:05.137 13:34:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:05.137 ************************************ 00:12:05.137 END TEST raid_state_function_test_sb 00:12:05.137 ************************************ 00:12:05.137 13:34:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:05.137 13:34:52 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:12:05.137 13:34:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:05.137 13:34:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:05.137 13:34:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:05.137 ************************************ 00:12:05.137 START TEST raid_superblock_test 00:12:05.137 ************************************ 00:12:05.137 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:12:05.137 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:12:05.137 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:12:05.137 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:05.137 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:05.137 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local strip_size 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4189205 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4189205 /var/tmp/spdk-raid.sock 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4189205 ']' 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:05.138 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:05.138 13:34:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.138 [2024-07-15 13:34:52.627277] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
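The records above show how raid_superblock_test provisions its own SPDK application: a bare bdev_svc process is started on a private RPC socket (-r /var/tmp/spdk-raid.sock) with bdev_raid debug logging, and the harness waits for that socket to answer before issuing any bdev RPCs. A minimal stand-alone sketch of the same pattern, assuming the SPDK checkout used in this workspace and a built bdev_svc binary; the rpc_get_methods poll stands in for the harness's waitforlisten helper:

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC_SOCK=/var/tmp/spdk-raid.sock

  # Start a bare bdev_svc app on its own RPC socket with raid debug logging.
  $SPDK_DIR/test/app/bdev_svc/bdev_svc -r $RPC_SOCK -L bdev_raid &
  raid_pid=$!

  # Wait until the app answers RPCs on that socket before driving the test.
  until $SPDK_DIR/scripts/rpc.py -s $RPC_SOCK rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
  done

  # ... bdev/raid RPCs against -s $RPC_SOCK go here ...

  kill $raid_pid

Every rpc.py invocation later in the trace passes the same -s /var/tmp/spdk-raid.sock, so the commands reach this private app rather than any system-wide SPDK instance.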
00:12:05.138 [2024-07-15 13:34:52.627326] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4189205 ] 00:12:05.138 [2024-07-15 13:34:52.714530] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.454 [2024-07-15 13:34:52.807472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.454 [2024-07-15 13:34:52.868153] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.454 [2024-07-15 13:34:52.868184] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:06.018 malloc1 00:12:06.018 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:06.275 [2024-07-15 13:34:53.750088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:06.275 [2024-07-15 13:34:53.750127] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:06.275 [2024-07-15 13:34:53.750142] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1772260 00:12:06.275 [2024-07-15 13:34:53.750150] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.275 [2024-07-15 13:34:53.751523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.275 [2024-07-15 13:34:53.751548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:06.275 pt1 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:06.275 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:06.533 malloc2 00:12:06.533 13:34:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:06.533 [2024-07-15 13:34:54.112181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:06.533 [2024-07-15 13:34:54.112221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:06.533 [2024-07-15 13:34:54.112250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191c310 00:12:06.533 [2024-07-15 13:34:54.112259] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.533 [2024-07-15 13:34:54.113486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.533 [2024-07-15 13:34:54.113510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:06.533 pt2 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:06.533 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:06.791 malloc3 00:12:06.791 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:07.049 [2024-07-15 13:34:54.466029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:07.049 [2024-07-15 13:34:54.466062] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.049 [2024-07-15 13:34:54.466090] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191fe70 00:12:07.049 [2024-07-15 13:34:54.466099] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.049 [2024-07-15 13:34:54.467124] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.049 [2024-07-15 13:34:54.467152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:07.049 pt3 00:12:07.049 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:07.049 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:07.049 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:07.049 [2024-07-15 13:34:54.654540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:07.049 [2024-07-15 13:34:54.655484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:07.049 [2024-07-15 13:34:54.655523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:07.049 [2024-07-15 13:34:54.655628] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1920e80 00:12:07.049 [2024-07-15 13:34:54.655635] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:07.049 [2024-07-15 13:34:54.655780] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x191b490 00:12:07.049 [2024-07-15 13:34:54.655880] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1920e80 00:12:07.049 [2024-07-15 13:34:54.655886] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1920e80 00:12:07.049 [2024-07-15 13:34:54.655957] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.308 "name": "raid_bdev1", 00:12:07.308 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:07.308 "strip_size_kb": 64, 00:12:07.308 "state": "online", 00:12:07.308 "raid_level": "raid0", 00:12:07.308 "superblock": true, 00:12:07.308 "num_base_bdevs": 3, 
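(Condensed from the rpc.py calls traced above; a minimal sketch, not the script itself — the trace invokes rpc.py by its full /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py path, shortened here.) The raid0 volume under test is assembled purely over the JSON-RPC socket: a malloc bdev, then a passthru bdev with a fixed UUID on top of it, repeated three times, followed by the raid create with superblocks enabled:

    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # ...the same pair is repeated for malloc2/pt2 and malloc3/pt3...
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s

The trailing -s asks bdev_raid_create to write a superblock to each base bdev, which is what the delete/re-examine steps later in this test rely on; -z 64 is the 64k strip size that shows up as "strip_size_kb": 64 in the dumps below.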
00:12:07.308 "num_base_bdevs_discovered": 3, 00:12:07.308 "num_base_bdevs_operational": 3, 00:12:07.308 "base_bdevs_list": [ 00:12:07.308 { 00:12:07.308 "name": "pt1", 00:12:07.308 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:07.308 "is_configured": true, 00:12:07.308 "data_offset": 2048, 00:12:07.308 "data_size": 63488 00:12:07.308 }, 00:12:07.308 { 00:12:07.308 "name": "pt2", 00:12:07.308 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:07.308 "is_configured": true, 00:12:07.308 "data_offset": 2048, 00:12:07.308 "data_size": 63488 00:12:07.308 }, 00:12:07.308 { 00:12:07.308 "name": "pt3", 00:12:07.308 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:07.308 "is_configured": true, 00:12:07.308 "data_offset": 2048, 00:12:07.308 "data_size": 63488 00:12:07.308 } 00:12:07.308 ] 00:12:07.308 }' 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.308 13:34:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.874 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:07.874 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:07.875 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:07.875 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:07.875 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:07.875 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:07.875 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:07.875 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:08.133 [2024-07-15 13:34:55.504876] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:08.133 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:08.133 "name": "raid_bdev1", 00:12:08.133 "aliases": [ 00:12:08.133 "77fe096a-6c42-4fd6-ba59-cfb35511fb7c" 00:12:08.133 ], 00:12:08.133 "product_name": "Raid Volume", 00:12:08.133 "block_size": 512, 00:12:08.133 "num_blocks": 190464, 00:12:08.133 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:08.133 "assigned_rate_limits": { 00:12:08.133 "rw_ios_per_sec": 0, 00:12:08.133 "rw_mbytes_per_sec": 0, 00:12:08.133 "r_mbytes_per_sec": 0, 00:12:08.133 "w_mbytes_per_sec": 0 00:12:08.133 }, 00:12:08.133 "claimed": false, 00:12:08.133 "zoned": false, 00:12:08.133 "supported_io_types": { 00:12:08.133 "read": true, 00:12:08.133 "write": true, 00:12:08.133 "unmap": true, 00:12:08.133 "flush": true, 00:12:08.133 "reset": true, 00:12:08.133 "nvme_admin": false, 00:12:08.133 "nvme_io": false, 00:12:08.133 "nvme_io_md": false, 00:12:08.133 "write_zeroes": true, 00:12:08.133 "zcopy": false, 00:12:08.133 "get_zone_info": false, 00:12:08.133 "zone_management": false, 00:12:08.133 "zone_append": false, 00:12:08.133 "compare": false, 00:12:08.133 "compare_and_write": false, 00:12:08.133 "abort": false, 00:12:08.133 "seek_hole": false, 00:12:08.133 "seek_data": false, 00:12:08.133 "copy": false, 00:12:08.133 "nvme_iov_md": false 00:12:08.133 }, 00:12:08.133 "memory_domains": [ 00:12:08.133 { 00:12:08.133 "dma_device_id": "system", 00:12:08.133 "dma_device_type": 1 
00:12:08.133 }, 00:12:08.133 { 00:12:08.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.133 "dma_device_type": 2 00:12:08.133 }, 00:12:08.133 { 00:12:08.133 "dma_device_id": "system", 00:12:08.133 "dma_device_type": 1 00:12:08.133 }, 00:12:08.133 { 00:12:08.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.133 "dma_device_type": 2 00:12:08.133 }, 00:12:08.133 { 00:12:08.133 "dma_device_id": "system", 00:12:08.133 "dma_device_type": 1 00:12:08.133 }, 00:12:08.133 { 00:12:08.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.133 "dma_device_type": 2 00:12:08.133 } 00:12:08.133 ], 00:12:08.133 "driver_specific": { 00:12:08.133 "raid": { 00:12:08.133 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:08.133 "strip_size_kb": 64, 00:12:08.133 "state": "online", 00:12:08.133 "raid_level": "raid0", 00:12:08.133 "superblock": true, 00:12:08.133 "num_base_bdevs": 3, 00:12:08.133 "num_base_bdevs_discovered": 3, 00:12:08.133 "num_base_bdevs_operational": 3, 00:12:08.133 "base_bdevs_list": [ 00:12:08.134 { 00:12:08.134 "name": "pt1", 00:12:08.134 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:08.134 "is_configured": true, 00:12:08.134 "data_offset": 2048, 00:12:08.134 "data_size": 63488 00:12:08.134 }, 00:12:08.134 { 00:12:08.134 "name": "pt2", 00:12:08.134 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:08.134 "is_configured": true, 00:12:08.134 "data_offset": 2048, 00:12:08.134 "data_size": 63488 00:12:08.134 }, 00:12:08.134 { 00:12:08.134 "name": "pt3", 00:12:08.134 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:08.134 "is_configured": true, 00:12:08.134 "data_offset": 2048, 00:12:08.134 "data_size": 63488 00:12:08.134 } 00:12:08.134 ] 00:12:08.134 } 00:12:08.134 } 00:12:08.134 }' 00:12:08.134 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:08.134 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:08.134 pt2 00:12:08.134 pt3' 00:12:08.134 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.134 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:08.134 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.134 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.134 "name": "pt1", 00:12:08.134 "aliases": [ 00:12:08.134 "00000000-0000-0000-0000-000000000001" 00:12:08.134 ], 00:12:08.134 "product_name": "passthru", 00:12:08.134 "block_size": 512, 00:12:08.134 "num_blocks": 65536, 00:12:08.134 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:08.134 "assigned_rate_limits": { 00:12:08.134 "rw_ios_per_sec": 0, 00:12:08.134 "rw_mbytes_per_sec": 0, 00:12:08.134 "r_mbytes_per_sec": 0, 00:12:08.134 "w_mbytes_per_sec": 0 00:12:08.134 }, 00:12:08.134 "claimed": true, 00:12:08.134 "claim_type": "exclusive_write", 00:12:08.134 "zoned": false, 00:12:08.134 "supported_io_types": { 00:12:08.134 "read": true, 00:12:08.134 "write": true, 00:12:08.134 "unmap": true, 00:12:08.134 "flush": true, 00:12:08.134 "reset": true, 00:12:08.134 "nvme_admin": false, 00:12:08.134 "nvme_io": false, 00:12:08.134 "nvme_io_md": false, 00:12:08.134 "write_zeroes": true, 00:12:08.134 "zcopy": true, 00:12:08.134 "get_zone_info": false, 00:12:08.134 "zone_management": false, 
00:12:08.134 "zone_append": false, 00:12:08.134 "compare": false, 00:12:08.134 "compare_and_write": false, 00:12:08.134 "abort": true, 00:12:08.134 "seek_hole": false, 00:12:08.134 "seek_data": false, 00:12:08.134 "copy": true, 00:12:08.134 "nvme_iov_md": false 00:12:08.134 }, 00:12:08.134 "memory_domains": [ 00:12:08.134 { 00:12:08.134 "dma_device_id": "system", 00:12:08.134 "dma_device_type": 1 00:12:08.134 }, 00:12:08.134 { 00:12:08.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.134 "dma_device_type": 2 00:12:08.134 } 00:12:08.134 ], 00:12:08.134 "driver_specific": { 00:12:08.134 "passthru": { 00:12:08.134 "name": "pt1", 00:12:08.134 "base_bdev_name": "malloc1" 00:12:08.134 } 00:12:08.134 } 00:12:08.134 }' 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.393 13:34:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.651 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.651 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.651 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:08.651 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.651 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.651 "name": "pt2", 00:12:08.651 "aliases": [ 00:12:08.652 "00000000-0000-0000-0000-000000000002" 00:12:08.652 ], 00:12:08.652 "product_name": "passthru", 00:12:08.652 "block_size": 512, 00:12:08.652 "num_blocks": 65536, 00:12:08.652 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:08.652 "assigned_rate_limits": { 00:12:08.652 "rw_ios_per_sec": 0, 00:12:08.652 "rw_mbytes_per_sec": 0, 00:12:08.652 "r_mbytes_per_sec": 0, 00:12:08.652 "w_mbytes_per_sec": 0 00:12:08.652 }, 00:12:08.652 "claimed": true, 00:12:08.652 "claim_type": "exclusive_write", 00:12:08.652 "zoned": false, 00:12:08.652 "supported_io_types": { 00:12:08.652 "read": true, 00:12:08.652 "write": true, 00:12:08.652 "unmap": true, 00:12:08.652 "flush": true, 00:12:08.652 "reset": true, 00:12:08.652 "nvme_admin": false, 00:12:08.652 "nvme_io": false, 00:12:08.652 "nvme_io_md": false, 00:12:08.652 "write_zeroes": true, 00:12:08.652 "zcopy": true, 00:12:08.652 "get_zone_info": false, 00:12:08.652 "zone_management": false, 00:12:08.652 "zone_append": false, 00:12:08.652 "compare": false, 00:12:08.652 "compare_and_write": false, 00:12:08.652 "abort": true, 
00:12:08.652 "seek_hole": false, 00:12:08.652 "seek_data": false, 00:12:08.652 "copy": true, 00:12:08.652 "nvme_iov_md": false 00:12:08.652 }, 00:12:08.652 "memory_domains": [ 00:12:08.652 { 00:12:08.652 "dma_device_id": "system", 00:12:08.652 "dma_device_type": 1 00:12:08.652 }, 00:12:08.652 { 00:12:08.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.652 "dma_device_type": 2 00:12:08.652 } 00:12:08.652 ], 00:12:08.652 "driver_specific": { 00:12:08.652 "passthru": { 00:12:08.652 "name": "pt2", 00:12:08.652 "base_bdev_name": "malloc2" 00:12:08.652 } 00:12:08.652 } 00:12:08.652 }' 00:12:08.652 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.652 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:08.911 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:09.170 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:09.170 "name": "pt3", 00:12:09.170 "aliases": [ 00:12:09.170 "00000000-0000-0000-0000-000000000003" 00:12:09.170 ], 00:12:09.170 "product_name": "passthru", 00:12:09.170 "block_size": 512, 00:12:09.170 "num_blocks": 65536, 00:12:09.170 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:09.170 "assigned_rate_limits": { 00:12:09.170 "rw_ios_per_sec": 0, 00:12:09.170 "rw_mbytes_per_sec": 0, 00:12:09.170 "r_mbytes_per_sec": 0, 00:12:09.170 "w_mbytes_per_sec": 0 00:12:09.170 }, 00:12:09.170 "claimed": true, 00:12:09.170 "claim_type": "exclusive_write", 00:12:09.170 "zoned": false, 00:12:09.170 "supported_io_types": { 00:12:09.170 "read": true, 00:12:09.170 "write": true, 00:12:09.170 "unmap": true, 00:12:09.170 "flush": true, 00:12:09.170 "reset": true, 00:12:09.170 "nvme_admin": false, 00:12:09.170 "nvme_io": false, 00:12:09.170 "nvme_io_md": false, 00:12:09.170 "write_zeroes": true, 00:12:09.170 "zcopy": true, 00:12:09.170 "get_zone_info": false, 00:12:09.170 "zone_management": false, 00:12:09.170 "zone_append": false, 00:12:09.170 "compare": false, 00:12:09.170 "compare_and_write": false, 00:12:09.170 "abort": true, 00:12:09.170 "seek_hole": false, 00:12:09.170 "seek_data": false, 00:12:09.170 "copy": true, 00:12:09.170 "nvme_iov_md": false 
00:12:09.170 }, 00:12:09.170 "memory_domains": [ 00:12:09.170 { 00:12:09.170 "dma_device_id": "system", 00:12:09.170 "dma_device_type": 1 00:12:09.170 }, 00:12:09.170 { 00:12:09.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.170 "dma_device_type": 2 00:12:09.170 } 00:12:09.170 ], 00:12:09.170 "driver_specific": { 00:12:09.170 "passthru": { 00:12:09.170 "name": "pt3", 00:12:09.170 "base_bdev_name": "malloc3" 00:12:09.170 } 00:12:09.170 } 00:12:09.170 }' 00:12:09.170 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.170 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.170 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:09.170 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:09.429 13:34:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:09.687 [2024-07-15 13:34:57.133079] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:09.687 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=77fe096a-6c42-4fd6-ba59-cfb35511fb7c 00:12:09.687 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 77fe096a-6c42-4fd6-ba59-cfb35511fb7c ']' 00:12:09.687 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:09.946 [2024-07-15 13:34:57.317376] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:09.946 [2024-07-15 13:34:57.317392] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:09.946 [2024-07-15 13:34:57.317424] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:09.946 [2024-07-15 13:34:57.317459] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:09.946 [2024-07-15 13:34:57.317467] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1920e80 name raid_bdev1, state offline 00:12:09.946 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.946 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:09.946 13:34:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:09.946 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:09.946 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:09.946 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:10.204 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:10.204 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:10.461 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:10.461 13:34:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:10.461 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:10.461 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:10.719 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:12:10.977 [2024-07-15 13:34:58.376183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:10.977 [2024-07-15 13:34:58.377194] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:10.977 [2024-07-15 13:34:58.377226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:10.977 [2024-07-15 13:34:58.377258] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:10.977 [2024-07-15 13:34:58.377286] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:10.977 [2024-07-15 13:34:58.377301] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:10.977 [2024-07-15 13:34:58.377313] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:10.977 [2024-07-15 13:34:58.377325] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x191c540 name raid_bdev1, state configuring 00:12:10.977 request: 00:12:10.977 { 00:12:10.977 "name": "raid_bdev1", 00:12:10.977 "raid_level": "raid0", 00:12:10.977 "base_bdevs": [ 00:12:10.977 "malloc1", 00:12:10.977 "malloc2", 00:12:10.977 "malloc3" 00:12:10.977 ], 00:12:10.977 "strip_size_kb": 64, 00:12:10.977 "superblock": false, 00:12:10.977 "method": "bdev_raid_create", 00:12:10.977 "req_id": 1 00:12:10.977 } 00:12:10.977 Got JSON-RPC error response 00:12:10.977 response: 00:12:10.977 { 00:12:10.977 "code": -17, 00:12:10.977 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:10.977 } 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:10.977 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:11.241 [2024-07-15 13:34:58.713019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:11.241 [2024-07-15 13:34:58.713059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:11.241 [2024-07-15 13:34:58.713072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191c9b0 00:12:11.241 [2024-07-15 13:34:58.713081] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:11.241 [2024-07-15 13:34:58.714347] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:11.241 [2024-07-15 13:34:58.714371] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:11.241 [2024-07-15 13:34:58.714426] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:11.241 [2024-07-15 13:34:58.714447] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:11.241 pt1 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.241 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:11.499 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:11.499 "name": "raid_bdev1", 00:12:11.499 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:11.499 "strip_size_kb": 64, 00:12:11.499 "state": "configuring", 00:12:11.499 "raid_level": "raid0", 00:12:11.499 "superblock": true, 00:12:11.499 "num_base_bdevs": 3, 00:12:11.499 "num_base_bdevs_discovered": 1, 00:12:11.499 "num_base_bdevs_operational": 3, 00:12:11.499 "base_bdevs_list": [ 00:12:11.499 { 00:12:11.499 "name": "pt1", 00:12:11.499 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:11.499 "is_configured": true, 00:12:11.499 "data_offset": 2048, 00:12:11.499 "data_size": 63488 00:12:11.499 }, 00:12:11.499 { 00:12:11.499 "name": null, 00:12:11.499 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:11.499 "is_configured": false, 00:12:11.499 "data_offset": 2048, 00:12:11.499 "data_size": 63488 00:12:11.499 }, 00:12:11.499 { 00:12:11.499 "name": null, 00:12:11.499 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:11.499 "is_configured": false, 00:12:11.499 "data_offset": 2048, 00:12:11.499 "data_size": 63488 00:12:11.499 } 00:12:11.499 ] 00:12:11.499 }' 00:12:11.499 13:34:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:11.499 13:34:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.064 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:12:12.064 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:12.064 [2024-07-15 13:34:59.543168] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:12.064 [2024-07-15 13:34:59.543207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:12.064 [2024-07-15 13:34:59.543223] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1772e80 00:12:12.064 [2024-07-15 13:34:59.543232] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:12.064 [2024-07-15 13:34:59.543473] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:12.064 [2024-07-15 13:34:59.543484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:12.064 [2024-07-15 13:34:59.543530] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:12.064 [2024-07-15 13:34:59.543543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:12.064 pt2 00:12:12.064 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:12.322 [2024-07-15 13:34:59.715622] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.322 "name": "raid_bdev1", 00:12:12.322 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:12.322 "strip_size_kb": 64, 00:12:12.322 "state": "configuring", 00:12:12.322 "raid_level": "raid0", 00:12:12.322 "superblock": true, 00:12:12.322 "num_base_bdevs": 3, 00:12:12.322 "num_base_bdevs_discovered": 1, 00:12:12.322 "num_base_bdevs_operational": 3, 00:12:12.322 "base_bdevs_list": [ 00:12:12.322 { 00:12:12.322 "name": "pt1", 00:12:12.322 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:12.322 "is_configured": true, 00:12:12.322 "data_offset": 2048, 00:12:12.322 "data_size": 63488 00:12:12.322 }, 00:12:12.322 { 00:12:12.322 "name": null, 00:12:12.322 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:12.322 "is_configured": false, 00:12:12.322 
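(Sketch of the remove/re-add cycle being exercised around this point; the commands are the ones visible in the trace, with the rpc.py path shortened as before.) After the full teardown, pt1 is re-created and claimed from its on-disk superblock, pt2 is added and then deliberately deleted again — raid_bdev1 stays in "configuring" with only pt1 discovered — and finally pt2 and pt3 are added back so the raid re-assembles and goes online:

    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    # once pt3 is back as well, raid_bdev1 transitions from configuring to online

The "raid superblock found on bdev ptN" DEBUG lines are the examine path performing that re-assembly.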
"data_offset": 2048, 00:12:12.322 "data_size": 63488 00:12:12.322 }, 00:12:12.322 { 00:12:12.322 "name": null, 00:12:12.322 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:12.322 "is_configured": false, 00:12:12.322 "data_offset": 2048, 00:12:12.322 "data_size": 63488 00:12:12.322 } 00:12:12.322 ] 00:12:12.322 }' 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.322 13:34:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.887 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:12.887 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:12.887 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:13.145 [2024-07-15 13:35:00.561809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:13.145 [2024-07-15 13:35:00.561849] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.145 [2024-07-15 13:35:00.561863] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1772630 00:12:13.145 [2024-07-15 13:35:00.561873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.145 [2024-07-15 13:35:00.562137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.145 [2024-07-15 13:35:00.562150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:13.145 [2024-07-15 13:35:00.562198] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:13.145 [2024-07-15 13:35:00.562211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:13.145 pt2 00:12:13.145 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:13.145 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:13.145 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:13.145 [2024-07-15 13:35:00.750293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:13.145 [2024-07-15 13:35:00.750323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.145 [2024-07-15 13:35:00.750333] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191eb10 00:12:13.145 [2024-07-15 13:35:00.750342] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.145 [2024-07-15 13:35:00.750545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.145 [2024-07-15 13:35:00.750556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:13.145 [2024-07-15 13:35:00.750593] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:12:13.145 [2024-07-15 13:35:00.750605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:13.145 [2024-07-15 13:35:00.750675] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1923500 00:12:13.145 [2024-07-15 13:35:00.750681] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:13.145 [2024-07-15 13:35:00.750782] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ed750 00:12:13.145 [2024-07-15 13:35:00.750862] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1923500 00:12:13.145 [2024-07-15 13:35:00.750868] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1923500 00:12:13.145 [2024-07-15 13:35:00.750928] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:13.145 pt3 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.403 "name": "raid_bdev1", 00:12:13.403 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:13.403 "strip_size_kb": 64, 00:12:13.403 "state": "online", 00:12:13.403 "raid_level": "raid0", 00:12:13.403 "superblock": true, 00:12:13.403 "num_base_bdevs": 3, 00:12:13.403 "num_base_bdevs_discovered": 3, 00:12:13.403 "num_base_bdevs_operational": 3, 00:12:13.403 "base_bdevs_list": [ 00:12:13.403 { 00:12:13.403 "name": "pt1", 00:12:13.403 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:13.403 "is_configured": true, 00:12:13.403 "data_offset": 2048, 00:12:13.403 "data_size": 63488 00:12:13.403 }, 00:12:13.403 { 00:12:13.403 "name": "pt2", 00:12:13.403 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:13.403 "is_configured": true, 00:12:13.403 "data_offset": 2048, 00:12:13.403 "data_size": 63488 00:12:13.403 }, 00:12:13.403 { 00:12:13.403 "name": "pt3", 00:12:13.403 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:13.403 "is_configured": true, 00:12:13.403 "data_offset": 2048, 00:12:13.403 "data_size": 63488 00:12:13.403 } 00:12:13.403 ] 00:12:13.403 }' 00:12:13.403 13:35:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.403 13:35:00 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:13.968 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:13.968 [2024-07-15 13:35:01.584624] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:14.225 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:14.225 "name": "raid_bdev1", 00:12:14.225 "aliases": [ 00:12:14.225 "77fe096a-6c42-4fd6-ba59-cfb35511fb7c" 00:12:14.225 ], 00:12:14.225 "product_name": "Raid Volume", 00:12:14.225 "block_size": 512, 00:12:14.225 "num_blocks": 190464, 00:12:14.225 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:14.225 "assigned_rate_limits": { 00:12:14.225 "rw_ios_per_sec": 0, 00:12:14.225 "rw_mbytes_per_sec": 0, 00:12:14.225 "r_mbytes_per_sec": 0, 00:12:14.225 "w_mbytes_per_sec": 0 00:12:14.225 }, 00:12:14.225 "claimed": false, 00:12:14.225 "zoned": false, 00:12:14.225 "supported_io_types": { 00:12:14.225 "read": true, 00:12:14.226 "write": true, 00:12:14.226 "unmap": true, 00:12:14.226 "flush": true, 00:12:14.226 "reset": true, 00:12:14.226 "nvme_admin": false, 00:12:14.226 "nvme_io": false, 00:12:14.226 "nvme_io_md": false, 00:12:14.226 "write_zeroes": true, 00:12:14.226 "zcopy": false, 00:12:14.226 "get_zone_info": false, 00:12:14.226 "zone_management": false, 00:12:14.226 "zone_append": false, 00:12:14.226 "compare": false, 00:12:14.226 "compare_and_write": false, 00:12:14.226 "abort": false, 00:12:14.226 "seek_hole": false, 00:12:14.226 "seek_data": false, 00:12:14.226 "copy": false, 00:12:14.226 "nvme_iov_md": false 00:12:14.226 }, 00:12:14.226 "memory_domains": [ 00:12:14.226 { 00:12:14.226 "dma_device_id": "system", 00:12:14.226 "dma_device_type": 1 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.226 "dma_device_type": 2 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "dma_device_id": "system", 00:12:14.226 "dma_device_type": 1 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.226 "dma_device_type": 2 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "dma_device_id": "system", 00:12:14.226 "dma_device_type": 1 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.226 "dma_device_type": 2 00:12:14.226 } 00:12:14.226 ], 00:12:14.226 "driver_specific": { 00:12:14.226 "raid": { 00:12:14.226 "uuid": "77fe096a-6c42-4fd6-ba59-cfb35511fb7c", 00:12:14.226 "strip_size_kb": 64, 00:12:14.226 "state": "online", 00:12:14.226 "raid_level": "raid0", 00:12:14.226 "superblock": true, 00:12:14.226 "num_base_bdevs": 3, 00:12:14.226 "num_base_bdevs_discovered": 3, 
00:12:14.226 "num_base_bdevs_operational": 3, 00:12:14.226 "base_bdevs_list": [ 00:12:14.226 { 00:12:14.226 "name": "pt1", 00:12:14.226 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:14.226 "is_configured": true, 00:12:14.226 "data_offset": 2048, 00:12:14.226 "data_size": 63488 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "name": "pt2", 00:12:14.226 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.226 "is_configured": true, 00:12:14.226 "data_offset": 2048, 00:12:14.226 "data_size": 63488 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "name": "pt3", 00:12:14.226 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:14.226 "is_configured": true, 00:12:14.226 "data_offset": 2048, 00:12:14.226 "data_size": 63488 00:12:14.226 } 00:12:14.226 ] 00:12:14.226 } 00:12:14.226 } 00:12:14.226 }' 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:14.226 pt2 00:12:14.226 pt3' 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:14.226 "name": "pt1", 00:12:14.226 "aliases": [ 00:12:14.226 "00000000-0000-0000-0000-000000000001" 00:12:14.226 ], 00:12:14.226 "product_name": "passthru", 00:12:14.226 "block_size": 512, 00:12:14.226 "num_blocks": 65536, 00:12:14.226 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:14.226 "assigned_rate_limits": { 00:12:14.226 "rw_ios_per_sec": 0, 00:12:14.226 "rw_mbytes_per_sec": 0, 00:12:14.226 "r_mbytes_per_sec": 0, 00:12:14.226 "w_mbytes_per_sec": 0 00:12:14.226 }, 00:12:14.226 "claimed": true, 00:12:14.226 "claim_type": "exclusive_write", 00:12:14.226 "zoned": false, 00:12:14.226 "supported_io_types": { 00:12:14.226 "read": true, 00:12:14.226 "write": true, 00:12:14.226 "unmap": true, 00:12:14.226 "flush": true, 00:12:14.226 "reset": true, 00:12:14.226 "nvme_admin": false, 00:12:14.226 "nvme_io": false, 00:12:14.226 "nvme_io_md": false, 00:12:14.226 "write_zeroes": true, 00:12:14.226 "zcopy": true, 00:12:14.226 "get_zone_info": false, 00:12:14.226 "zone_management": false, 00:12:14.226 "zone_append": false, 00:12:14.226 "compare": false, 00:12:14.226 "compare_and_write": false, 00:12:14.226 "abort": true, 00:12:14.226 "seek_hole": false, 00:12:14.226 "seek_data": false, 00:12:14.226 "copy": true, 00:12:14.226 "nvme_iov_md": false 00:12:14.226 }, 00:12:14.226 "memory_domains": [ 00:12:14.226 { 00:12:14.226 "dma_device_id": "system", 00:12:14.226 "dma_device_type": 1 00:12:14.226 }, 00:12:14.226 { 00:12:14.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.226 "dma_device_type": 2 00:12:14.226 } 00:12:14.226 ], 00:12:14.226 "driver_specific": { 00:12:14.226 "passthru": { 00:12:14.226 "name": "pt1", 00:12:14.226 "base_bdev_name": "malloc1" 00:12:14.226 } 00:12:14.226 } 00:12:14.226 }' 00:12:14.226 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.483 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:12:14.483 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:14.483 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.483 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.483 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:14.483 13:35:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.483 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.483 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:14.483 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:14.741 "name": "pt2", 00:12:14.741 "aliases": [ 00:12:14.741 "00000000-0000-0000-0000-000000000002" 00:12:14.741 ], 00:12:14.741 "product_name": "passthru", 00:12:14.741 "block_size": 512, 00:12:14.741 "num_blocks": 65536, 00:12:14.741 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.741 "assigned_rate_limits": { 00:12:14.741 "rw_ios_per_sec": 0, 00:12:14.741 "rw_mbytes_per_sec": 0, 00:12:14.741 "r_mbytes_per_sec": 0, 00:12:14.741 "w_mbytes_per_sec": 0 00:12:14.741 }, 00:12:14.741 "claimed": true, 00:12:14.741 "claim_type": "exclusive_write", 00:12:14.741 "zoned": false, 00:12:14.741 "supported_io_types": { 00:12:14.741 "read": true, 00:12:14.741 "write": true, 00:12:14.741 "unmap": true, 00:12:14.741 "flush": true, 00:12:14.741 "reset": true, 00:12:14.741 "nvme_admin": false, 00:12:14.741 "nvme_io": false, 00:12:14.741 "nvme_io_md": false, 00:12:14.741 "write_zeroes": true, 00:12:14.741 "zcopy": true, 00:12:14.741 "get_zone_info": false, 00:12:14.741 "zone_management": false, 00:12:14.741 "zone_append": false, 00:12:14.741 "compare": false, 00:12:14.741 "compare_and_write": false, 00:12:14.741 "abort": true, 00:12:14.741 "seek_hole": false, 00:12:14.741 "seek_data": false, 00:12:14.741 "copy": true, 00:12:14.741 "nvme_iov_md": false 00:12:14.741 }, 00:12:14.741 "memory_domains": [ 00:12:14.741 { 00:12:14.741 "dma_device_id": "system", 00:12:14.741 "dma_device_type": 1 00:12:14.741 }, 00:12:14.741 { 00:12:14.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.741 "dma_device_type": 2 00:12:14.741 } 00:12:14.741 ], 00:12:14.741 "driver_specific": { 00:12:14.741 "passthru": { 00:12:14.741 "name": "pt2", 00:12:14.741 "base_bdev_name": "malloc2" 00:12:14.741 } 00:12:14.741 } 00:12:14.741 }' 00:12:14.741 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:14.999 13:35:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.999 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:15.256 "name": "pt3", 00:12:15.256 "aliases": [ 00:12:15.256 "00000000-0000-0000-0000-000000000003" 00:12:15.256 ], 00:12:15.256 "product_name": "passthru", 00:12:15.256 "block_size": 512, 00:12:15.256 "num_blocks": 65536, 00:12:15.256 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:15.256 "assigned_rate_limits": { 00:12:15.256 "rw_ios_per_sec": 0, 00:12:15.256 "rw_mbytes_per_sec": 0, 00:12:15.256 "r_mbytes_per_sec": 0, 00:12:15.256 "w_mbytes_per_sec": 0 00:12:15.256 }, 00:12:15.256 "claimed": true, 00:12:15.256 "claim_type": "exclusive_write", 00:12:15.256 "zoned": false, 00:12:15.256 "supported_io_types": { 00:12:15.256 "read": true, 00:12:15.256 "write": true, 00:12:15.256 "unmap": true, 00:12:15.256 "flush": true, 00:12:15.256 "reset": true, 00:12:15.256 "nvme_admin": false, 00:12:15.256 "nvme_io": false, 00:12:15.256 "nvme_io_md": false, 00:12:15.256 "write_zeroes": true, 00:12:15.256 "zcopy": true, 00:12:15.256 "get_zone_info": false, 00:12:15.256 "zone_management": false, 00:12:15.256 "zone_append": false, 00:12:15.256 "compare": false, 00:12:15.256 "compare_and_write": false, 00:12:15.256 "abort": true, 00:12:15.256 "seek_hole": false, 00:12:15.256 "seek_data": false, 00:12:15.256 "copy": true, 00:12:15.256 "nvme_iov_md": false 00:12:15.256 }, 00:12:15.256 "memory_domains": [ 00:12:15.256 { 00:12:15.256 "dma_device_id": "system", 00:12:15.256 "dma_device_type": 1 00:12:15.256 }, 00:12:15.256 { 00:12:15.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.256 "dma_device_type": 2 00:12:15.256 } 00:12:15.256 ], 00:12:15.256 "driver_specific": { 00:12:15.256 "passthru": { 00:12:15.256 "name": "pt3", 00:12:15.256 "base_bdev_name": "malloc3" 00:12:15.256 } 00:12:15.256 } 00:12:15.256 }' 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:15.256 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:15.515 13:35:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:15.515 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:15.515 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:15.515 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:15.515 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:15.777 [2024-07-15 13:35:03.236935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 77fe096a-6c42-4fd6-ba59-cfb35511fb7c '!=' 77fe096a-6c42-4fd6-ba59-cfb35511fb7c ']' 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4189205 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4189205 ']' 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4189205 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4189205 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4189205' 00:12:15.777 killing process with pid 4189205 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4189205 00:12:15.777 [2024-07-15 13:35:03.288707] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:15.777 [2024-07-15 13:35:03.288747] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:15.777 [2024-07-15 13:35:03.288784] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:15.777 [2024-07-15 13:35:03.288792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1923500 name raid_bdev1, state offline 00:12:15.777 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4189205 00:12:15.777 [2024-07-15 13:35:03.314226] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:16.038 13:35:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:16.038 00:12:16.038 real 0m10.925s 00:12:16.038 user 0m19.453s 00:12:16.038 sys 0m2.159s 00:12:16.038 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:16.038 13:35:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.038 ************************************ 00:12:16.038 END TEST raid_superblock_test 00:12:16.038 ************************************ 00:12:16.038 13:35:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:16.038 13:35:03 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:12:16.038 13:35:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:16.038 13:35:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:16.038 13:35:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:16.038 ************************************ 00:12:16.038 START TEST raid_read_error_test 00:12:16.038 ************************************ 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:16.038 13:35:03 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.1pC7QesLO2 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4191003 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4191003 /var/tmp/spdk-raid.sock 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4191003 ']' 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:16.038 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.038 13:35:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:16.038 [2024-07-15 13:35:03.643344] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:12:16.038 [2024-07-15 13:35:03.643394] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4191003 ] 00:12:16.295 [2024-07-15 13:35:03.731461] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:16.295 [2024-07-15 13:35:03.822487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:16.295 [2024-07-15 13:35:03.880330] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:16.295 [2024-07-15 13:35:03.880354] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:16.860 13:35:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:16.860 13:35:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:16.860 13:35:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:16.860 13:35:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:17.117 BaseBdev1_malloc 00:12:17.117 13:35:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:17.373 true 00:12:17.373 13:35:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:17.373 [2024-07-15 13:35:04.961367] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:17.373 [2024-07-15 13:35:04.961404] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:17.373 [2024-07-15 13:35:04.961419] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd0990 00:12:17.373 [2024-07-15 13:35:04.961428] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:17.373 [2024-07-15 13:35:04.962837] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:17.373 [2024-07-15 13:35:04.962861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:17.373 BaseBdev1 00:12:17.373 13:35:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:17.373 13:35:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:17.630 BaseBdev2_malloc 00:12:17.630 13:35:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:17.887 true 00:12:17.887 13:35:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:17.887 [2024-07-15 13:35:05.494402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:17.887 [2024-07-15 13:35:05.494436] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:17.887 [2024-07-15 13:35:05.494468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd51d0 00:12:17.887 [2024-07-15 13:35:05.494477] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:17.887 [2024-07-15 13:35:05.495642] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:17.887 [2024-07-15 13:35:05.495667] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:17.887 BaseBdev2 00:12:18.145 13:35:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:18.145 13:35:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:18.145 BaseBdev3_malloc 00:12:18.145 13:35:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:18.403 true 00:12:18.403 13:35:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:18.661 [2024-07-15 13:35:06.036693] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:18.661 [2024-07-15 13:35:06.036727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:18.661 [2024-07-15 13:35:06.036756] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd7490 00:12:18.661 [2024-07-15 13:35:06.036765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:18.661 [2024-07-15 13:35:06.037905] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:18.661 [2024-07-15 13:35:06.037928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:18.661 BaseBdev3 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:18.661 [2024-07-15 13:35:06.197133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:18.661 [2024-07-15 13:35:06.198062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:18.661 [2024-07-15 13:35:06.198110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:18.661 [2024-07-15 13:35:06.198252] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfd8b40 00:12:18.661 [2024-07-15 13:35:06.198259] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:18.661 [2024-07-15 13:35:06.198392] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd86e0 00:12:18.661 [2024-07-15 13:35:06.198493] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfd8b40 00:12:18.661 [2024-07-15 13:35:06.198499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfd8b40 00:12:18.661 [2024-07-15 13:35:06.198570] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.661 
13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.661 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:18.918 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.918 "name": "raid_bdev1", 00:12:18.918 "uuid": "885433b6-2636-4e7a-b8c3-3533f0e30334", 00:12:18.918 "strip_size_kb": 64, 00:12:18.918 "state": "online", 00:12:18.918 "raid_level": "raid0", 00:12:18.918 "superblock": true, 00:12:18.918 "num_base_bdevs": 3, 00:12:18.918 "num_base_bdevs_discovered": 3, 00:12:18.918 "num_base_bdevs_operational": 3, 00:12:18.918 "base_bdevs_list": [ 00:12:18.918 { 00:12:18.918 "name": "BaseBdev1", 00:12:18.918 "uuid": "d06b840e-d7ce-5933-b038-9f6f1b5433e5", 00:12:18.918 "is_configured": true, 00:12:18.918 "data_offset": 2048, 00:12:18.918 "data_size": 63488 00:12:18.918 }, 00:12:18.918 { 00:12:18.918 "name": "BaseBdev2", 00:12:18.918 "uuid": "691aa20a-f980-50a2-85ff-9894cbad3030", 00:12:18.918 "is_configured": true, 00:12:18.918 "data_offset": 2048, 00:12:18.918 "data_size": 63488 00:12:18.918 }, 00:12:18.918 { 00:12:18.918 "name": "BaseBdev3", 00:12:18.918 "uuid": "222d6c3b-88f5-52a6-8978-1ba4f5fb1350", 00:12:18.918 "is_configured": true, 00:12:18.918 "data_offset": 2048, 00:12:18.918 "data_size": 63488 00:12:18.918 } 00:12:18.918 ] 00:12:18.918 }' 00:12:18.918 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.918 13:35:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.483 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:19.483 13:35:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:19.483 [2024-07-15 13:35:06.951440] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe26e70 00:12:20.417 13:35:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:20.674 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.675 "name": "raid_bdev1", 00:12:20.675 "uuid": "885433b6-2636-4e7a-b8c3-3533f0e30334", 00:12:20.675 "strip_size_kb": 64, 00:12:20.675 "state": "online", 00:12:20.675 "raid_level": "raid0", 00:12:20.675 "superblock": true, 00:12:20.675 "num_base_bdevs": 3, 00:12:20.675 "num_base_bdevs_discovered": 3, 00:12:20.675 "num_base_bdevs_operational": 3, 00:12:20.675 "base_bdevs_list": [ 00:12:20.675 { 00:12:20.675 "name": "BaseBdev1", 00:12:20.675 "uuid": "d06b840e-d7ce-5933-b038-9f6f1b5433e5", 00:12:20.675 "is_configured": true, 00:12:20.675 "data_offset": 2048, 00:12:20.675 "data_size": 63488 00:12:20.675 }, 00:12:20.675 { 00:12:20.675 "name": "BaseBdev2", 00:12:20.675 "uuid": "691aa20a-f980-50a2-85ff-9894cbad3030", 00:12:20.675 "is_configured": true, 00:12:20.675 "data_offset": 2048, 00:12:20.675 "data_size": 63488 00:12:20.675 }, 00:12:20.675 { 00:12:20.675 "name": "BaseBdev3", 00:12:20.675 "uuid": "222d6c3b-88f5-52a6-8978-1ba4f5fb1350", 00:12:20.675 "is_configured": true, 00:12:20.675 "data_offset": 2048, 00:12:20.675 "data_size": 63488 00:12:20.675 } 00:12:20.675 ] 00:12:20.675 }' 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.675 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.257 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:21.523 [2024-07-15 13:35:08.892631] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:21.523 [2024-07-15 13:35:08.892662] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:21.523 [2024-07-15 
13:35:08.894774] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:21.523 [2024-07-15 13:35:08.894800] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:21.523 [2024-07-15 13:35:08.894824] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:21.523 [2024-07-15 13:35:08.894832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd8b40 name raid_bdev1, state offline 00:12:21.523 0 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4191003 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4191003 ']' 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4191003 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4191003 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4191003' 00:12:21.523 killing process with pid 4191003 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4191003 00:12:21.523 [2024-07-15 13:35:08.959041] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:21.523 13:35:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4191003 00:12:21.523 [2024-07-15 13:35:08.978689] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.1pC7QesLO2 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:21.782 00:12:21.782 real 0m5.602s 00:12:21.782 user 0m8.554s 00:12:21.782 sys 0m1.010s 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:21.782 13:35:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.782 ************************************ 00:12:21.782 END TEST raid_read_error_test 00:12:21.782 ************************************ 00:12:21.782 13:35:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:21.782 13:35:09 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:12:21.782 13:35:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:21.782 
13:35:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:21.782 13:35:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:21.782 ************************************ 00:12:21.782 START TEST raid_write_error_test 00:12:21.782 ************************************ 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.bP1IVLfllU 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4191820 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4191820 /var/tmp/spdk-raid.sock 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4191820 ']' 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:21.782 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:21.782 13:35:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.782 [2024-07-15 13:35:09.325502] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:12:21.782 [2024-07-15 13:35:09.325553] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4191820 ] 00:12:22.040 [2024-07-15 13:35:09.412766] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:22.040 [2024-07-15 13:35:09.504258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:22.040 [2024-07-15 13:35:09.565039] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.040 [2024-07-15 13:35:09.565068] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.606 13:35:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:22.606 13:35:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:22.606 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:22.606 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:22.862 BaseBdev1_malloc 00:12:22.862 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:22.862 true 00:12:23.119 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:23.119 [2024-07-15 13:35:10.634557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:23.119 [2024-07-15 13:35:10.634592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:23.119 [2024-07-15 13:35:10.634623] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27c5990 00:12:23.119 [2024-07-15 13:35:10.634631] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:23.119 [2024-07-15 13:35:10.636014] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:23.119 [2024-07-15 13:35:10.636037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:12:23.119 BaseBdev1 00:12:23.119 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:23.119 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:23.376 BaseBdev2_malloc 00:12:23.376 13:35:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:23.633 true 00:12:23.633 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:23.633 [2024-07-15 13:35:11.164840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:23.633 [2024-07-15 13:35:11.164876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:23.633 [2024-07-15 13:35:11.164892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27ca1d0 00:12:23.633 [2024-07-15 13:35:11.164901] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:23.633 [2024-07-15 13:35:11.166085] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:23.633 [2024-07-15 13:35:11.166108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:23.633 BaseBdev2 00:12:23.633 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:23.633 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:23.890 BaseBdev3_malloc 00:12:23.890 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:23.890 true 00:12:24.147 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:24.147 [2024-07-15 13:35:11.679132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:24.147 [2024-07-15 13:35:11.679166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:24.147 [2024-07-15 13:35:11.679181] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27cc490 00:12:24.147 [2024-07-15 13:35:11.679189] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:24.147 [2024-07-15 13:35:11.680335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:24.147 [2024-07-15 13:35:11.680358] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:24.147 BaseBdev3 00:12:24.147 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:24.404 [2024-07-15 13:35:11.851605] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:24.404 [2024-07-15 13:35:11.852580] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:24.404 [2024-07-15 13:35:11.852628] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:24.404 [2024-07-15 13:35:11.852768] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27cdb40 00:12:24.404 [2024-07-15 13:35:11.852775] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:24.404 [2024-07-15 13:35:11.852914] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27cd6e0 00:12:24.404 [2024-07-15 13:35:11.853023] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27cdb40 00:12:24.404 [2024-07-15 13:35:11.853029] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27cdb40 00:12:24.404 [2024-07-15 13:35:11.853101] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.404 13:35:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:24.687 13:35:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.687 "name": "raid_bdev1", 00:12:24.687 "uuid": "80d29fc9-0bc3-48b0-9673-7563526cd606", 00:12:24.687 "strip_size_kb": 64, 00:12:24.687 "state": "online", 00:12:24.687 "raid_level": "raid0", 00:12:24.687 "superblock": true, 00:12:24.687 "num_base_bdevs": 3, 00:12:24.687 "num_base_bdevs_discovered": 3, 00:12:24.687 "num_base_bdevs_operational": 3, 00:12:24.687 "base_bdevs_list": [ 00:12:24.687 { 00:12:24.687 "name": "BaseBdev1", 00:12:24.687 "uuid": "a00052ee-caa9-516a-b081-759bb6b8c91d", 00:12:24.687 "is_configured": true, 00:12:24.687 "data_offset": 2048, 00:12:24.687 "data_size": 63488 00:12:24.687 }, 00:12:24.687 { 00:12:24.687 "name": "BaseBdev2", 00:12:24.687 "uuid": "eb79d79b-b61f-5af3-a0b7-264aaca9f91e", 00:12:24.687 "is_configured": true, 00:12:24.687 "data_offset": 2048, 00:12:24.687 "data_size": 63488 00:12:24.687 }, 00:12:24.687 { 00:12:24.687 "name": "BaseBdev3", 00:12:24.687 "uuid": 
"17756884-25d1-55d9-b092-fc73e79c0b2a", 00:12:24.687 "is_configured": true, 00:12:24.687 "data_offset": 2048, 00:12:24.687 "data_size": 63488 00:12:24.687 } 00:12:24.687 ] 00:12:24.687 }' 00:12:24.687 13:35:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.687 13:35:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.944 13:35:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:24.944 13:35:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:25.201 [2024-07-15 13:35:12.613779] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x261be70 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.132 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:26.390 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.390 "name": "raid_bdev1", 00:12:26.390 "uuid": "80d29fc9-0bc3-48b0-9673-7563526cd606", 00:12:26.390 "strip_size_kb": 64, 00:12:26.390 "state": "online", 00:12:26.390 "raid_level": "raid0", 00:12:26.390 "superblock": true, 00:12:26.390 "num_base_bdevs": 3, 00:12:26.390 "num_base_bdevs_discovered": 3, 00:12:26.390 "num_base_bdevs_operational": 3, 00:12:26.390 "base_bdevs_list": [ 00:12:26.390 { 00:12:26.390 "name": "BaseBdev1", 00:12:26.390 "uuid": "a00052ee-caa9-516a-b081-759bb6b8c91d", 00:12:26.390 "is_configured": true, 00:12:26.390 "data_offset": 2048, 00:12:26.390 "data_size": 63488 00:12:26.390 }, 00:12:26.390 { 
00:12:26.390 "name": "BaseBdev2", 00:12:26.390 "uuid": "eb79d79b-b61f-5af3-a0b7-264aaca9f91e", 00:12:26.390 "is_configured": true, 00:12:26.390 "data_offset": 2048, 00:12:26.390 "data_size": 63488 00:12:26.390 }, 00:12:26.390 { 00:12:26.390 "name": "BaseBdev3", 00:12:26.390 "uuid": "17756884-25d1-55d9-b092-fc73e79c0b2a", 00:12:26.390 "is_configured": true, 00:12:26.390 "data_offset": 2048, 00:12:26.390 "data_size": 63488 00:12:26.390 } 00:12:26.390 ] 00:12:26.390 }' 00:12:26.390 13:35:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.390 13:35:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.957 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:26.957 [2024-07-15 13:35:14.554698] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:26.957 [2024-07-15 13:35:14.554727] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:26.957 [2024-07-15 13:35:14.556829] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:26.957 [2024-07-15 13:35:14.556856] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.957 [2024-07-15 13:35:14.556887] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:26.957 [2024-07-15 13:35:14.556895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27cdb40 name raid_bdev1, state offline 00:12:26.957 0 00:12:26.957 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4191820 00:12:26.957 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4191820 ']' 00:12:26.957 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4191820 00:12:26.957 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4191820 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4191820' 00:12:27.216 killing process with pid 4191820 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4191820 00:12:27.216 [2024-07-15 13:35:14.618429] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:27.216 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4191820 00:12:27.216 [2024-07-15 13:35:14.638384] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.bP1IVLfllU 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:27.475 00:12:27.475 real 0m5.593s 00:12:27.475 user 0m8.556s 00:12:27.475 sys 0m0.981s 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:27.475 13:35:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.475 ************************************ 00:12:27.475 END TEST raid_write_error_test 00:12:27.475 ************************************ 00:12:27.475 13:35:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:27.475 13:35:14 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:27.475 13:35:14 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:12:27.475 13:35:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:27.475 13:35:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:27.475 13:35:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:27.475 ************************************ 00:12:27.475 START TEST raid_state_function_test 00:12:27.475 ************************************ 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 
00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:27.475 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4192624 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4192624' 00:12:27.476 Process raid pid: 4192624 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4192624 /var/tmp/spdk-raid.sock 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4192624 ']' 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:27.476 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:27.476 13:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.476 [2024-07-15 13:35:15.007073] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:12:27.476 [2024-07-15 13:35:15.007134] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:27.735 [2024-07-15 13:35:15.096878] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:27.735 [2024-07-15 13:35:15.182816] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:27.735 [2024-07-15 13:35:15.237554] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:27.735 [2024-07-15 13:35:15.237583] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:28.314 13:35:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:28.314 13:35:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:28.314 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:28.572 [2024-07-15 13:35:15.960722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:28.572 [2024-07-15 13:35:15.960753] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:28.572 [2024-07-15 13:35:15.960760] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:28.572 [2024-07-15 13:35:15.960784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:28.572 [2024-07-15 13:35:15.960790] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:28.572 [2024-07-15 13:35:15.960797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.572 13:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.572 13:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:12:28.572 "name": "Existed_Raid", 00:12:28.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.572 "strip_size_kb": 64, 00:12:28.572 "state": "configuring", 00:12:28.572 "raid_level": "concat", 00:12:28.572 "superblock": false, 00:12:28.572 "num_base_bdevs": 3, 00:12:28.572 "num_base_bdevs_discovered": 0, 00:12:28.572 "num_base_bdevs_operational": 3, 00:12:28.572 "base_bdevs_list": [ 00:12:28.572 { 00:12:28.572 "name": "BaseBdev1", 00:12:28.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.573 "is_configured": false, 00:12:28.573 "data_offset": 0, 00:12:28.573 "data_size": 0 00:12:28.573 }, 00:12:28.573 { 00:12:28.573 "name": "BaseBdev2", 00:12:28.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.573 "is_configured": false, 00:12:28.573 "data_offset": 0, 00:12:28.573 "data_size": 0 00:12:28.573 }, 00:12:28.573 { 00:12:28.573 "name": "BaseBdev3", 00:12:28.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.573 "is_configured": false, 00:12:28.573 "data_offset": 0, 00:12:28.573 "data_size": 0 00:12:28.573 } 00:12:28.573 ] 00:12:28.573 }' 00:12:28.573 13:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.573 13:35:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.140 13:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:29.400 [2024-07-15 13:35:16.830865] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:29.400 [2024-07-15 13:35:16.830888] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2235f50 name Existed_Raid, state configuring 00:12:29.400 13:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:29.400 [2024-07-15 13:35:17.015375] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:29.400 [2024-07-15 13:35:17.015397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:29.400 [2024-07-15 13:35:17.015403] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:29.400 [2024-07-15 13:35:17.015411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:29.400 [2024-07-15 13:35:17.015416] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:29.400 [2024-07-15 13:35:17.015424] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:29.659 [2024-07-15 13:35:17.196522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:29.659 BaseBdev1 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:29.659 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:29.917 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:30.176 [ 00:12:30.176 { 00:12:30.176 "name": "BaseBdev1", 00:12:30.176 "aliases": [ 00:12:30.176 "1209b369-5c0b-458a-a6f8-0862f0ddf77d" 00:12:30.176 ], 00:12:30.176 "product_name": "Malloc disk", 00:12:30.176 "block_size": 512, 00:12:30.176 "num_blocks": 65536, 00:12:30.176 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:30.176 "assigned_rate_limits": { 00:12:30.176 "rw_ios_per_sec": 0, 00:12:30.176 "rw_mbytes_per_sec": 0, 00:12:30.176 "r_mbytes_per_sec": 0, 00:12:30.176 "w_mbytes_per_sec": 0 00:12:30.176 }, 00:12:30.176 "claimed": true, 00:12:30.176 "claim_type": "exclusive_write", 00:12:30.176 "zoned": false, 00:12:30.176 "supported_io_types": { 00:12:30.176 "read": true, 00:12:30.176 "write": true, 00:12:30.176 "unmap": true, 00:12:30.176 "flush": true, 00:12:30.176 "reset": true, 00:12:30.176 "nvme_admin": false, 00:12:30.176 "nvme_io": false, 00:12:30.176 "nvme_io_md": false, 00:12:30.176 "write_zeroes": true, 00:12:30.176 "zcopy": true, 00:12:30.176 "get_zone_info": false, 00:12:30.176 "zone_management": false, 00:12:30.176 "zone_append": false, 00:12:30.176 "compare": false, 00:12:30.176 "compare_and_write": false, 00:12:30.176 "abort": true, 00:12:30.176 "seek_hole": false, 00:12:30.176 "seek_data": false, 00:12:30.176 "copy": true, 00:12:30.176 "nvme_iov_md": false 00:12:30.176 }, 00:12:30.176 "memory_domains": [ 00:12:30.176 { 00:12:30.176 "dma_device_id": "system", 00:12:30.176 "dma_device_type": 1 00:12:30.176 }, 00:12:30.176 { 00:12:30.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.176 "dma_device_type": 2 00:12:30.176 } 00:12:30.176 ], 00:12:30.176 "driver_specific": {} 00:12:30.176 } 00:12:30.176 ] 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.176 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.176 "name": "Existed_Raid", 00:12:30.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.176 "strip_size_kb": 64, 00:12:30.176 "state": "configuring", 00:12:30.176 "raid_level": "concat", 00:12:30.176 "superblock": false, 00:12:30.176 "num_base_bdevs": 3, 00:12:30.176 "num_base_bdevs_discovered": 1, 00:12:30.176 "num_base_bdevs_operational": 3, 00:12:30.176 "base_bdevs_list": [ 00:12:30.176 { 00:12:30.176 "name": "BaseBdev1", 00:12:30.176 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:30.176 "is_configured": true, 00:12:30.176 "data_offset": 0, 00:12:30.176 "data_size": 65536 00:12:30.176 }, 00:12:30.176 { 00:12:30.177 "name": "BaseBdev2", 00:12:30.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.177 "is_configured": false, 00:12:30.177 "data_offset": 0, 00:12:30.177 "data_size": 0 00:12:30.177 }, 00:12:30.177 { 00:12:30.177 "name": "BaseBdev3", 00:12:30.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.177 "is_configured": false, 00:12:30.177 "data_offset": 0, 00:12:30.177 "data_size": 0 00:12:30.177 } 00:12:30.177 ] 00:12:30.177 }' 00:12:30.177 13:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.177 13:35:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.744 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:31.003 [2024-07-15 13:35:18.379622] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:31.003 [2024-07-15 13:35:18.379662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2235820 name Existed_Raid, state configuring 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:31.003 [2024-07-15 13:35:18.556116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:31.003 [2024-07-15 13:35:18.557273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:31.003 [2024-07-15 13:35:18.557304] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:31.003 [2024-07-15 13:35:18.557311] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:31.003 [2024-07-15 13:35:18.557319] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.003 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.261 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.261 "name": "Existed_Raid", 00:12:31.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.261 "strip_size_kb": 64, 00:12:31.261 "state": "configuring", 00:12:31.261 "raid_level": "concat", 00:12:31.261 "superblock": false, 00:12:31.261 "num_base_bdevs": 3, 00:12:31.261 "num_base_bdevs_discovered": 1, 00:12:31.261 "num_base_bdevs_operational": 3, 00:12:31.261 "base_bdevs_list": [ 00:12:31.261 { 00:12:31.261 "name": "BaseBdev1", 00:12:31.261 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:31.261 "is_configured": true, 00:12:31.261 "data_offset": 0, 00:12:31.261 "data_size": 65536 00:12:31.261 }, 00:12:31.261 { 00:12:31.261 "name": "BaseBdev2", 00:12:31.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.261 "is_configured": false, 00:12:31.262 "data_offset": 0, 00:12:31.262 "data_size": 0 00:12:31.262 }, 00:12:31.262 { 00:12:31.262 "name": "BaseBdev3", 00:12:31.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.262 "is_configured": false, 00:12:31.262 "data_offset": 0, 00:12:31.262 "data_size": 0 00:12:31.262 } 00:12:31.262 ] 00:12:31.262 }' 00:12:31.262 13:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.262 13:35:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:31.829 [2024-07-15 13:35:19.422120] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:31.829 BaseBdev2 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:31.829 13:35:19 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:31.829 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:32.087 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:32.346 [ 00:12:32.346 { 00:12:32.346 "name": "BaseBdev2", 00:12:32.346 "aliases": [ 00:12:32.346 "b621a474-26bf-49cb-8415-74ab7f2839e9" 00:12:32.346 ], 00:12:32.346 "product_name": "Malloc disk", 00:12:32.346 "block_size": 512, 00:12:32.346 "num_blocks": 65536, 00:12:32.346 "uuid": "b621a474-26bf-49cb-8415-74ab7f2839e9", 00:12:32.346 "assigned_rate_limits": { 00:12:32.346 "rw_ios_per_sec": 0, 00:12:32.346 "rw_mbytes_per_sec": 0, 00:12:32.346 "r_mbytes_per_sec": 0, 00:12:32.346 "w_mbytes_per_sec": 0 00:12:32.346 }, 00:12:32.346 "claimed": true, 00:12:32.346 "claim_type": "exclusive_write", 00:12:32.346 "zoned": false, 00:12:32.346 "supported_io_types": { 00:12:32.346 "read": true, 00:12:32.346 "write": true, 00:12:32.346 "unmap": true, 00:12:32.346 "flush": true, 00:12:32.346 "reset": true, 00:12:32.346 "nvme_admin": false, 00:12:32.346 "nvme_io": false, 00:12:32.346 "nvme_io_md": false, 00:12:32.346 "write_zeroes": true, 00:12:32.346 "zcopy": true, 00:12:32.346 "get_zone_info": false, 00:12:32.346 "zone_management": false, 00:12:32.346 "zone_append": false, 00:12:32.346 "compare": false, 00:12:32.346 "compare_and_write": false, 00:12:32.346 "abort": true, 00:12:32.346 "seek_hole": false, 00:12:32.346 "seek_data": false, 00:12:32.346 "copy": true, 00:12:32.346 "nvme_iov_md": false 00:12:32.346 }, 00:12:32.346 "memory_domains": [ 00:12:32.346 { 00:12:32.346 "dma_device_id": "system", 00:12:32.346 "dma_device_type": 1 00:12:32.346 }, 00:12:32.346 { 00:12:32.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.346 "dma_device_type": 2 00:12:32.346 } 00:12:32.346 ], 00:12:32.346 "driver_specific": {} 00:12:32.346 } 00:12:32.346 ] 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.346 
13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.346 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.604 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.604 "name": "Existed_Raid", 00:12:32.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.604 "strip_size_kb": 64, 00:12:32.604 "state": "configuring", 00:12:32.604 "raid_level": "concat", 00:12:32.604 "superblock": false, 00:12:32.604 "num_base_bdevs": 3, 00:12:32.604 "num_base_bdevs_discovered": 2, 00:12:32.604 "num_base_bdevs_operational": 3, 00:12:32.604 "base_bdevs_list": [ 00:12:32.604 { 00:12:32.604 "name": "BaseBdev1", 00:12:32.604 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:32.604 "is_configured": true, 00:12:32.604 "data_offset": 0, 00:12:32.604 "data_size": 65536 00:12:32.604 }, 00:12:32.604 { 00:12:32.604 "name": "BaseBdev2", 00:12:32.604 "uuid": "b621a474-26bf-49cb-8415-74ab7f2839e9", 00:12:32.604 "is_configured": true, 00:12:32.604 "data_offset": 0, 00:12:32.604 "data_size": 65536 00:12:32.604 }, 00:12:32.604 { 00:12:32.604 "name": "BaseBdev3", 00:12:32.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.604 "is_configured": false, 00:12:32.604 "data_offset": 0, 00:12:32.604 "data_size": 0 00:12:32.604 } 00:12:32.604 ] 00:12:32.604 }' 00:12:32.604 13:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.604 13:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:33.170 [2024-07-15 13:35:20.640242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:33.170 [2024-07-15 13:35:20.640284] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2236710 00:12:33.170 [2024-07-15 13:35:20.640290] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:33.170 [2024-07-15 13:35:20.640432] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22363e0 00:12:33.170 [2024-07-15 13:35:20.640523] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2236710 00:12:33.170 [2024-07-15 13:35:20.640530] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2236710 00:12:33.170 [2024-07-15 13:35:20.640660] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:33.170 BaseBdev3 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:33.170 13:35:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:33.170 13:35:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:33.429 13:35:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:33.429 [ 00:12:33.429 { 00:12:33.429 "name": "BaseBdev3", 00:12:33.429 "aliases": [ 00:12:33.429 "cf221c37-d13b-4c05-bd31-8e41dc5d0105" 00:12:33.429 ], 00:12:33.429 "product_name": "Malloc disk", 00:12:33.429 "block_size": 512, 00:12:33.429 "num_blocks": 65536, 00:12:33.429 "uuid": "cf221c37-d13b-4c05-bd31-8e41dc5d0105", 00:12:33.429 "assigned_rate_limits": { 00:12:33.429 "rw_ios_per_sec": 0, 00:12:33.429 "rw_mbytes_per_sec": 0, 00:12:33.429 "r_mbytes_per_sec": 0, 00:12:33.429 "w_mbytes_per_sec": 0 00:12:33.429 }, 00:12:33.429 "claimed": true, 00:12:33.429 "claim_type": "exclusive_write", 00:12:33.429 "zoned": false, 00:12:33.429 "supported_io_types": { 00:12:33.429 "read": true, 00:12:33.429 "write": true, 00:12:33.429 "unmap": true, 00:12:33.429 "flush": true, 00:12:33.429 "reset": true, 00:12:33.429 "nvme_admin": false, 00:12:33.429 "nvme_io": false, 00:12:33.429 "nvme_io_md": false, 00:12:33.429 "write_zeroes": true, 00:12:33.429 "zcopy": true, 00:12:33.429 "get_zone_info": false, 00:12:33.429 "zone_management": false, 00:12:33.429 "zone_append": false, 00:12:33.429 "compare": false, 00:12:33.429 "compare_and_write": false, 00:12:33.429 "abort": true, 00:12:33.429 "seek_hole": false, 00:12:33.429 "seek_data": false, 00:12:33.429 "copy": true, 00:12:33.429 "nvme_iov_md": false 00:12:33.429 }, 00:12:33.429 "memory_domains": [ 00:12:33.429 { 00:12:33.429 "dma_device_id": "system", 00:12:33.429 "dma_device_type": 1 00:12:33.429 }, 00:12:33.429 { 00:12:33.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.429 "dma_device_type": 2 00:12:33.429 } 00:12:33.429 ], 00:12:33.429 "driver_specific": {} 00:12:33.429 } 00:12:33.429 ] 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.429 13:35:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.429 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.687 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.687 "name": "Existed_Raid", 00:12:33.687 "uuid": "196747a6-7cc9-44db-8c7d-4726742c94a2", 00:12:33.687 "strip_size_kb": 64, 00:12:33.687 "state": "online", 00:12:33.687 "raid_level": "concat", 00:12:33.687 "superblock": false, 00:12:33.687 "num_base_bdevs": 3, 00:12:33.687 "num_base_bdevs_discovered": 3, 00:12:33.687 "num_base_bdevs_operational": 3, 00:12:33.687 "base_bdevs_list": [ 00:12:33.687 { 00:12:33.687 "name": "BaseBdev1", 00:12:33.687 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:33.687 "is_configured": true, 00:12:33.687 "data_offset": 0, 00:12:33.687 "data_size": 65536 00:12:33.687 }, 00:12:33.687 { 00:12:33.687 "name": "BaseBdev2", 00:12:33.687 "uuid": "b621a474-26bf-49cb-8415-74ab7f2839e9", 00:12:33.687 "is_configured": true, 00:12:33.687 "data_offset": 0, 00:12:33.687 "data_size": 65536 00:12:33.687 }, 00:12:33.687 { 00:12:33.687 "name": "BaseBdev3", 00:12:33.687 "uuid": "cf221c37-d13b-4c05-bd31-8e41dc5d0105", 00:12:33.687 "is_configured": true, 00:12:33.687 "data_offset": 0, 00:12:33.687 "data_size": 65536 00:12:33.687 } 00:12:33.687 ] 00:12:33.687 }' 00:12:33.687 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.687 13:35:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:34.254 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:34.254 [2024-07-15 13:35:21.855567] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:34.537 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:34.537 "name": "Existed_Raid", 00:12:34.537 "aliases": [ 00:12:34.537 "196747a6-7cc9-44db-8c7d-4726742c94a2" 00:12:34.537 ], 00:12:34.537 "product_name": "Raid Volume", 00:12:34.537 "block_size": 512, 00:12:34.537 "num_blocks": 196608, 00:12:34.537 "uuid": "196747a6-7cc9-44db-8c7d-4726742c94a2", 
00:12:34.537 "assigned_rate_limits": { 00:12:34.537 "rw_ios_per_sec": 0, 00:12:34.537 "rw_mbytes_per_sec": 0, 00:12:34.537 "r_mbytes_per_sec": 0, 00:12:34.537 "w_mbytes_per_sec": 0 00:12:34.537 }, 00:12:34.537 "claimed": false, 00:12:34.537 "zoned": false, 00:12:34.537 "supported_io_types": { 00:12:34.537 "read": true, 00:12:34.537 "write": true, 00:12:34.537 "unmap": true, 00:12:34.537 "flush": true, 00:12:34.537 "reset": true, 00:12:34.537 "nvme_admin": false, 00:12:34.537 "nvme_io": false, 00:12:34.537 "nvme_io_md": false, 00:12:34.537 "write_zeroes": true, 00:12:34.537 "zcopy": false, 00:12:34.537 "get_zone_info": false, 00:12:34.537 "zone_management": false, 00:12:34.537 "zone_append": false, 00:12:34.537 "compare": false, 00:12:34.537 "compare_and_write": false, 00:12:34.537 "abort": false, 00:12:34.537 "seek_hole": false, 00:12:34.537 "seek_data": false, 00:12:34.537 "copy": false, 00:12:34.537 "nvme_iov_md": false 00:12:34.537 }, 00:12:34.537 "memory_domains": [ 00:12:34.537 { 00:12:34.537 "dma_device_id": "system", 00:12:34.537 "dma_device_type": 1 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.537 "dma_device_type": 2 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "dma_device_id": "system", 00:12:34.537 "dma_device_type": 1 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.537 "dma_device_type": 2 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "dma_device_id": "system", 00:12:34.537 "dma_device_type": 1 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.537 "dma_device_type": 2 00:12:34.537 } 00:12:34.537 ], 00:12:34.537 "driver_specific": { 00:12:34.537 "raid": { 00:12:34.537 "uuid": "196747a6-7cc9-44db-8c7d-4726742c94a2", 00:12:34.537 "strip_size_kb": 64, 00:12:34.537 "state": "online", 00:12:34.537 "raid_level": "concat", 00:12:34.537 "superblock": false, 00:12:34.537 "num_base_bdevs": 3, 00:12:34.537 "num_base_bdevs_discovered": 3, 00:12:34.537 "num_base_bdevs_operational": 3, 00:12:34.537 "base_bdevs_list": [ 00:12:34.537 { 00:12:34.537 "name": "BaseBdev1", 00:12:34.537 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:34.537 "is_configured": true, 00:12:34.537 "data_offset": 0, 00:12:34.537 "data_size": 65536 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "name": "BaseBdev2", 00:12:34.537 "uuid": "b621a474-26bf-49cb-8415-74ab7f2839e9", 00:12:34.537 "is_configured": true, 00:12:34.537 "data_offset": 0, 00:12:34.537 "data_size": 65536 00:12:34.537 }, 00:12:34.537 { 00:12:34.537 "name": "BaseBdev3", 00:12:34.537 "uuid": "cf221c37-d13b-4c05-bd31-8e41dc5d0105", 00:12:34.537 "is_configured": true, 00:12:34.537 "data_offset": 0, 00:12:34.537 "data_size": 65536 00:12:34.537 } 00:12:34.537 ] 00:12:34.537 } 00:12:34.537 } 00:12:34.537 }' 00:12:34.537 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:34.537 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:34.537 BaseBdev2 00:12:34.537 BaseBdev3' 00:12:34.537 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:34.537 13:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:34.537 13:35:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:34.538 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:34.538 "name": "BaseBdev1", 00:12:34.538 "aliases": [ 00:12:34.538 "1209b369-5c0b-458a-a6f8-0862f0ddf77d" 00:12:34.538 ], 00:12:34.538 "product_name": "Malloc disk", 00:12:34.538 "block_size": 512, 00:12:34.538 "num_blocks": 65536, 00:12:34.538 "uuid": "1209b369-5c0b-458a-a6f8-0862f0ddf77d", 00:12:34.538 "assigned_rate_limits": { 00:12:34.538 "rw_ios_per_sec": 0, 00:12:34.538 "rw_mbytes_per_sec": 0, 00:12:34.538 "r_mbytes_per_sec": 0, 00:12:34.538 "w_mbytes_per_sec": 0 00:12:34.538 }, 00:12:34.538 "claimed": true, 00:12:34.538 "claim_type": "exclusive_write", 00:12:34.538 "zoned": false, 00:12:34.538 "supported_io_types": { 00:12:34.538 "read": true, 00:12:34.538 "write": true, 00:12:34.538 "unmap": true, 00:12:34.538 "flush": true, 00:12:34.538 "reset": true, 00:12:34.538 "nvme_admin": false, 00:12:34.538 "nvme_io": false, 00:12:34.538 "nvme_io_md": false, 00:12:34.538 "write_zeroes": true, 00:12:34.538 "zcopy": true, 00:12:34.538 "get_zone_info": false, 00:12:34.538 "zone_management": false, 00:12:34.538 "zone_append": false, 00:12:34.538 "compare": false, 00:12:34.538 "compare_and_write": false, 00:12:34.538 "abort": true, 00:12:34.538 "seek_hole": false, 00:12:34.538 "seek_data": false, 00:12:34.538 "copy": true, 00:12:34.538 "nvme_iov_md": false 00:12:34.538 }, 00:12:34.538 "memory_domains": [ 00:12:34.538 { 00:12:34.538 "dma_device_id": "system", 00:12:34.538 "dma_device_type": 1 00:12:34.538 }, 00:12:34.538 { 00:12:34.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.538 "dma_device_type": 2 00:12:34.538 } 00:12:34.538 ], 00:12:34.538 "driver_specific": {} 00:12:34.538 }' 00:12:34.538 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.538 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:34.795 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.053 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.053 "name": "BaseBdev2", 
00:12:35.053 "aliases": [ 00:12:35.053 "b621a474-26bf-49cb-8415-74ab7f2839e9" 00:12:35.053 ], 00:12:35.053 "product_name": "Malloc disk", 00:12:35.053 "block_size": 512, 00:12:35.053 "num_blocks": 65536, 00:12:35.053 "uuid": "b621a474-26bf-49cb-8415-74ab7f2839e9", 00:12:35.053 "assigned_rate_limits": { 00:12:35.053 "rw_ios_per_sec": 0, 00:12:35.053 "rw_mbytes_per_sec": 0, 00:12:35.053 "r_mbytes_per_sec": 0, 00:12:35.053 "w_mbytes_per_sec": 0 00:12:35.053 }, 00:12:35.053 "claimed": true, 00:12:35.053 "claim_type": "exclusive_write", 00:12:35.053 "zoned": false, 00:12:35.053 "supported_io_types": { 00:12:35.053 "read": true, 00:12:35.053 "write": true, 00:12:35.053 "unmap": true, 00:12:35.053 "flush": true, 00:12:35.053 "reset": true, 00:12:35.053 "nvme_admin": false, 00:12:35.053 "nvme_io": false, 00:12:35.053 "nvme_io_md": false, 00:12:35.053 "write_zeroes": true, 00:12:35.053 "zcopy": true, 00:12:35.053 "get_zone_info": false, 00:12:35.053 "zone_management": false, 00:12:35.053 "zone_append": false, 00:12:35.053 "compare": false, 00:12:35.053 "compare_and_write": false, 00:12:35.053 "abort": true, 00:12:35.053 "seek_hole": false, 00:12:35.053 "seek_data": false, 00:12:35.053 "copy": true, 00:12:35.053 "nvme_iov_md": false 00:12:35.053 }, 00:12:35.053 "memory_domains": [ 00:12:35.053 { 00:12:35.053 "dma_device_id": "system", 00:12:35.053 "dma_device_type": 1 00:12:35.053 }, 00:12:35.053 { 00:12:35.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.053 "dma_device_type": 2 00:12:35.053 } 00:12:35.053 ], 00:12:35.053 "driver_specific": {} 00:12:35.053 }' 00:12:35.053 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.053 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.053 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.053 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:35.313 13:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.571 "name": "BaseBdev3", 00:12:35.571 "aliases": [ 00:12:35.571 "cf221c37-d13b-4c05-bd31-8e41dc5d0105" 00:12:35.571 ], 00:12:35.571 "product_name": "Malloc disk", 00:12:35.571 "block_size": 512, 
00:12:35.571 "num_blocks": 65536, 00:12:35.571 "uuid": "cf221c37-d13b-4c05-bd31-8e41dc5d0105", 00:12:35.571 "assigned_rate_limits": { 00:12:35.571 "rw_ios_per_sec": 0, 00:12:35.571 "rw_mbytes_per_sec": 0, 00:12:35.571 "r_mbytes_per_sec": 0, 00:12:35.571 "w_mbytes_per_sec": 0 00:12:35.571 }, 00:12:35.571 "claimed": true, 00:12:35.571 "claim_type": "exclusive_write", 00:12:35.571 "zoned": false, 00:12:35.571 "supported_io_types": { 00:12:35.571 "read": true, 00:12:35.571 "write": true, 00:12:35.571 "unmap": true, 00:12:35.571 "flush": true, 00:12:35.571 "reset": true, 00:12:35.571 "nvme_admin": false, 00:12:35.571 "nvme_io": false, 00:12:35.571 "nvme_io_md": false, 00:12:35.571 "write_zeroes": true, 00:12:35.571 "zcopy": true, 00:12:35.571 "get_zone_info": false, 00:12:35.571 "zone_management": false, 00:12:35.571 "zone_append": false, 00:12:35.571 "compare": false, 00:12:35.571 "compare_and_write": false, 00:12:35.571 "abort": true, 00:12:35.571 "seek_hole": false, 00:12:35.571 "seek_data": false, 00:12:35.571 "copy": true, 00:12:35.571 "nvme_iov_md": false 00:12:35.571 }, 00:12:35.571 "memory_domains": [ 00:12:35.571 { 00:12:35.571 "dma_device_id": "system", 00:12:35.571 "dma_device_type": 1 00:12:35.571 }, 00:12:35.571 { 00:12:35.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.571 "dma_device_type": 2 00:12:35.571 } 00:12:35.571 ], 00:12:35.571 "driver_specific": {} 00:12:35.571 }' 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.571 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.828 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.828 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.828 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.828 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.828 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.828 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:36.087 [2024-07-15 13:35:23.479622] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:36.087 [2024-07-15 13:35:23.479648] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:36.087 [2024-07-15 13:35:23.479678] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:36.087 13:35:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.087 "name": "Existed_Raid", 00:12:36.087 "uuid": "196747a6-7cc9-44db-8c7d-4726742c94a2", 00:12:36.087 "strip_size_kb": 64, 00:12:36.087 "state": "offline", 00:12:36.087 "raid_level": "concat", 00:12:36.087 "superblock": false, 00:12:36.087 "num_base_bdevs": 3, 00:12:36.087 "num_base_bdevs_discovered": 2, 00:12:36.087 "num_base_bdevs_operational": 2, 00:12:36.087 "base_bdevs_list": [ 00:12:36.087 { 00:12:36.087 "name": null, 00:12:36.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.087 "is_configured": false, 00:12:36.087 "data_offset": 0, 00:12:36.087 "data_size": 65536 00:12:36.087 }, 00:12:36.087 { 00:12:36.087 "name": "BaseBdev2", 00:12:36.087 "uuid": "b621a474-26bf-49cb-8415-74ab7f2839e9", 00:12:36.087 "is_configured": true, 00:12:36.087 "data_offset": 0, 00:12:36.087 "data_size": 65536 00:12:36.087 }, 00:12:36.087 { 00:12:36.087 "name": "BaseBdev3", 00:12:36.087 "uuid": "cf221c37-d13b-4c05-bd31-8e41dc5d0105", 00:12:36.087 "is_configured": true, 00:12:36.087 "data_offset": 0, 00:12:36.087 "data_size": 65536 00:12:36.087 } 00:12:36.087 ] 00:12:36.087 }' 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.087 13:35:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.654 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:36.654 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:36.654 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:36.654 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.913 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:36.913 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:36.913 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:36.913 [2024-07-15 13:35:24.511295] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:37.171 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:37.430 [2024-07-15 13:35:24.855859] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:37.430 [2024-07-15 13:35:24.855895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2236710 name Existed_Raid, state offline 00:12:37.430 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:37.430 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:37.430 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.430 13:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:37.688 BaseBdev2 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:37.688 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.947 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:37.947 [ 00:12:37.947 { 00:12:37.947 "name": "BaseBdev2", 00:12:37.947 "aliases": [ 00:12:37.947 "058a2a87-2295-44e6-8f80-a5f2c8f329a0" 00:12:37.947 ], 00:12:37.947 "product_name": "Malloc disk", 00:12:37.947 "block_size": 512, 00:12:37.947 "num_blocks": 65536, 00:12:37.947 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:37.947 "assigned_rate_limits": { 00:12:37.947 "rw_ios_per_sec": 0, 00:12:37.947 "rw_mbytes_per_sec": 0, 00:12:37.947 "r_mbytes_per_sec": 0, 00:12:37.947 "w_mbytes_per_sec": 0 00:12:37.947 }, 00:12:37.947 "claimed": false, 00:12:37.947 "zoned": false, 00:12:37.947 "supported_io_types": { 00:12:37.947 "read": true, 00:12:37.947 "write": true, 00:12:37.947 "unmap": true, 00:12:37.947 "flush": true, 00:12:37.947 "reset": true, 00:12:37.947 "nvme_admin": false, 00:12:37.947 "nvme_io": false, 00:12:37.947 "nvme_io_md": false, 00:12:37.947 "write_zeroes": true, 00:12:37.947 "zcopy": true, 00:12:37.947 "get_zone_info": false, 00:12:37.947 "zone_management": false, 00:12:37.947 "zone_append": false, 00:12:37.947 "compare": false, 00:12:37.947 "compare_and_write": false, 00:12:37.947 "abort": true, 00:12:37.947 "seek_hole": false, 00:12:37.947 "seek_data": false, 00:12:37.947 "copy": true, 00:12:37.947 "nvme_iov_md": false 00:12:37.947 }, 00:12:37.947 "memory_domains": [ 00:12:37.947 { 00:12:37.947 "dma_device_id": "system", 00:12:37.948 "dma_device_type": 1 00:12:37.948 }, 00:12:37.948 { 00:12:37.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.948 "dma_device_type": 2 00:12:37.948 } 00:12:37.948 ], 00:12:37.948 "driver_specific": {} 00:12:37.948 } 00:12:37.948 ] 00:12:37.948 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:37.948 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:37.948 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:37.948 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:38.206 BaseBdev3 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:38.206 13:35:25 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:38.465 13:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:38.465 [ 00:12:38.465 { 00:12:38.465 "name": "BaseBdev3", 00:12:38.465 "aliases": [ 00:12:38.465 "cfd8e6ff-6675-48b6-9883-c13c112a581d" 00:12:38.465 ], 00:12:38.465 "product_name": "Malloc disk", 00:12:38.465 "block_size": 512, 00:12:38.465 "num_blocks": 65536, 00:12:38.465 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:38.465 "assigned_rate_limits": { 00:12:38.465 "rw_ios_per_sec": 0, 00:12:38.465 "rw_mbytes_per_sec": 0, 00:12:38.465 "r_mbytes_per_sec": 0, 00:12:38.465 "w_mbytes_per_sec": 0 00:12:38.465 }, 00:12:38.465 "claimed": false, 00:12:38.465 "zoned": false, 00:12:38.465 "supported_io_types": { 00:12:38.465 "read": true, 00:12:38.465 "write": true, 00:12:38.465 "unmap": true, 00:12:38.465 "flush": true, 00:12:38.465 "reset": true, 00:12:38.465 "nvme_admin": false, 00:12:38.465 "nvme_io": false, 00:12:38.465 "nvme_io_md": false, 00:12:38.465 "write_zeroes": true, 00:12:38.465 "zcopy": true, 00:12:38.465 "get_zone_info": false, 00:12:38.465 "zone_management": false, 00:12:38.465 "zone_append": false, 00:12:38.465 "compare": false, 00:12:38.465 "compare_and_write": false, 00:12:38.465 "abort": true, 00:12:38.465 "seek_hole": false, 00:12:38.465 "seek_data": false, 00:12:38.465 "copy": true, 00:12:38.465 "nvme_iov_md": false 00:12:38.465 }, 00:12:38.465 "memory_domains": [ 00:12:38.465 { 00:12:38.465 "dma_device_id": "system", 00:12:38.465 "dma_device_type": 1 00:12:38.465 }, 00:12:38.465 { 00:12:38.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.466 "dma_device_type": 2 00:12:38.466 } 00:12:38.466 ], 00:12:38.466 "driver_specific": {} 00:12:38.466 } 00:12:38.466 ] 00:12:38.466 13:35:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:38.466 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:38.466 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:38.466 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:38.724 [2024-07-15 13:35:26.210423] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:38.724 [2024-07-15 13:35:26.210465] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:38.724 [2024-07-15 13:35:26.210478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:38.724 [2024-07-15 13:35:26.211513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.724 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.982 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.982 "name": "Existed_Raid", 00:12:38.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.982 "strip_size_kb": 64, 00:12:38.982 "state": "configuring", 00:12:38.982 "raid_level": "concat", 00:12:38.982 "superblock": false, 00:12:38.982 "num_base_bdevs": 3, 00:12:38.982 "num_base_bdevs_discovered": 2, 00:12:38.982 "num_base_bdevs_operational": 3, 00:12:38.982 "base_bdevs_list": [ 00:12:38.982 { 00:12:38.982 "name": "BaseBdev1", 00:12:38.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.982 "is_configured": false, 00:12:38.982 "data_offset": 0, 00:12:38.982 "data_size": 0 00:12:38.982 }, 00:12:38.982 { 00:12:38.982 "name": "BaseBdev2", 00:12:38.982 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:38.982 "is_configured": true, 00:12:38.982 "data_offset": 0, 00:12:38.982 "data_size": 65536 00:12:38.983 }, 00:12:38.983 { 00:12:38.983 "name": "BaseBdev3", 00:12:38.983 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:38.983 "is_configured": true, 00:12:38.983 "data_offset": 0, 00:12:38.983 "data_size": 65536 00:12:38.983 } 00:12:38.983 ] 00:12:38.983 }' 00:12:38.983 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.983 13:35:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.550 13:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:39.550 [2024-07-15 13:35:27.028500] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.550 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.809 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.809 "name": "Existed_Raid", 00:12:39.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.809 "strip_size_kb": 64, 00:12:39.809 "state": "configuring", 00:12:39.809 "raid_level": "concat", 00:12:39.809 "superblock": false, 00:12:39.809 "num_base_bdevs": 3, 00:12:39.809 "num_base_bdevs_discovered": 1, 00:12:39.809 "num_base_bdevs_operational": 3, 00:12:39.809 "base_bdevs_list": [ 00:12:39.809 { 00:12:39.809 "name": "BaseBdev1", 00:12:39.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.809 "is_configured": false, 00:12:39.809 "data_offset": 0, 00:12:39.809 "data_size": 0 00:12:39.809 }, 00:12:39.809 { 00:12:39.809 "name": null, 00:12:39.809 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:39.809 "is_configured": false, 00:12:39.809 "data_offset": 0, 00:12:39.809 "data_size": 65536 00:12:39.809 }, 00:12:39.809 { 00:12:39.809 "name": "BaseBdev3", 00:12:39.809 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:39.809 "is_configured": true, 00:12:39.809 "data_offset": 0, 00:12:39.809 "data_size": 65536 00:12:39.809 } 00:12:39.809 ] 00:12:39.809 }' 00:12:39.809 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.809 13:35:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.377 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.377 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:40.377 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:40.377 13:35:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:40.636 [2024-07-15 13:35:28.034221] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:40.636 BaseBdev1 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:40.636 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:40.896 [ 00:12:40.896 { 00:12:40.896 "name": "BaseBdev1", 00:12:40.896 "aliases": [ 00:12:40.896 "209bcc63-ad0f-4614-9bb3-38b5273709aa" 00:12:40.896 ], 00:12:40.896 "product_name": "Malloc disk", 00:12:40.896 "block_size": 512, 00:12:40.896 "num_blocks": 65536, 00:12:40.896 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:40.896 "assigned_rate_limits": { 00:12:40.896 "rw_ios_per_sec": 0, 00:12:40.896 "rw_mbytes_per_sec": 0, 00:12:40.896 "r_mbytes_per_sec": 0, 00:12:40.896 "w_mbytes_per_sec": 0 00:12:40.896 }, 00:12:40.896 "claimed": true, 00:12:40.896 "claim_type": "exclusive_write", 00:12:40.896 "zoned": false, 00:12:40.896 "supported_io_types": { 00:12:40.896 "read": true, 00:12:40.896 "write": true, 00:12:40.896 "unmap": true, 00:12:40.896 "flush": true, 00:12:40.896 "reset": true, 00:12:40.896 "nvme_admin": false, 00:12:40.896 "nvme_io": false, 00:12:40.896 "nvme_io_md": false, 00:12:40.896 "write_zeroes": true, 00:12:40.896 "zcopy": true, 00:12:40.896 "get_zone_info": false, 00:12:40.896 "zone_management": false, 00:12:40.896 "zone_append": false, 00:12:40.896 "compare": false, 00:12:40.896 "compare_and_write": false, 00:12:40.896 "abort": true, 00:12:40.896 "seek_hole": false, 00:12:40.896 "seek_data": false, 00:12:40.896 "copy": true, 00:12:40.896 "nvme_iov_md": false 00:12:40.896 }, 00:12:40.896 "memory_domains": [ 00:12:40.896 { 00:12:40.896 "dma_device_id": "system", 00:12:40.896 "dma_device_type": 1 00:12:40.896 }, 00:12:40.896 { 00:12:40.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.896 "dma_device_type": 2 00:12:40.896 } 00:12:40.896 ], 00:12:40.896 "driver_specific": {} 00:12:40.896 } 00:12:40.896 ] 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.896 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.896 13:35:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.154 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.154 "name": "Existed_Raid", 00:12:41.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.154 "strip_size_kb": 64, 00:12:41.154 "state": "configuring", 00:12:41.154 "raid_level": "concat", 00:12:41.154 "superblock": false, 00:12:41.154 "num_base_bdevs": 3, 00:12:41.154 "num_base_bdevs_discovered": 2, 00:12:41.154 "num_base_bdevs_operational": 3, 00:12:41.154 "base_bdevs_list": [ 00:12:41.154 { 00:12:41.154 "name": "BaseBdev1", 00:12:41.154 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:41.154 "is_configured": true, 00:12:41.154 "data_offset": 0, 00:12:41.154 "data_size": 65536 00:12:41.154 }, 00:12:41.154 { 00:12:41.154 "name": null, 00:12:41.154 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:41.154 "is_configured": false, 00:12:41.154 "data_offset": 0, 00:12:41.154 "data_size": 65536 00:12:41.154 }, 00:12:41.154 { 00:12:41.154 "name": "BaseBdev3", 00:12:41.154 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:41.154 "is_configured": true, 00:12:41.154 "data_offset": 0, 00:12:41.154 "data_size": 65536 00:12:41.154 } 00:12:41.154 ] 00:12:41.154 }' 00:12:41.154 13:35:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.154 13:35:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.413 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:41.413 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.672 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:41.672 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:41.932 [2024-07-15 13:35:29.333573] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.932 13:35:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.932 "name": "Existed_Raid", 00:12:41.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.932 "strip_size_kb": 64, 00:12:41.932 "state": "configuring", 00:12:41.932 "raid_level": "concat", 00:12:41.932 "superblock": false, 00:12:41.932 "num_base_bdevs": 3, 00:12:41.932 "num_base_bdevs_discovered": 1, 00:12:41.932 "num_base_bdevs_operational": 3, 00:12:41.932 "base_bdevs_list": [ 00:12:41.932 { 00:12:41.932 "name": "BaseBdev1", 00:12:41.932 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:41.932 "is_configured": true, 00:12:41.932 "data_offset": 0, 00:12:41.932 "data_size": 65536 00:12:41.932 }, 00:12:41.932 { 00:12:41.932 "name": null, 00:12:41.932 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:41.932 "is_configured": false, 00:12:41.932 "data_offset": 0, 00:12:41.932 "data_size": 65536 00:12:41.932 }, 00:12:41.932 { 00:12:41.932 "name": null, 00:12:41.932 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:41.932 "is_configured": false, 00:12:41.932 "data_offset": 0, 00:12:41.932 "data_size": 65536 00:12:41.932 } 00:12:41.932 ] 00:12:41.932 }' 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.932 13:35:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.499 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.499 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:42.757 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:42.757 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:43.016 [2024-07-15 13:35:30.392334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.016 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.016 "name": "Existed_Raid", 00:12:43.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.016 "strip_size_kb": 64, 00:12:43.016 "state": "configuring", 00:12:43.016 "raid_level": "concat", 00:12:43.016 "superblock": false, 00:12:43.016 "num_base_bdevs": 3, 00:12:43.016 "num_base_bdevs_discovered": 2, 00:12:43.016 "num_base_bdevs_operational": 3, 00:12:43.016 "base_bdevs_list": [ 00:12:43.016 { 00:12:43.016 "name": "BaseBdev1", 00:12:43.016 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:43.016 "is_configured": true, 00:12:43.016 "data_offset": 0, 00:12:43.016 "data_size": 65536 00:12:43.016 }, 00:12:43.016 { 00:12:43.016 "name": null, 00:12:43.016 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:43.016 "is_configured": false, 00:12:43.016 "data_offset": 0, 00:12:43.016 "data_size": 65536 00:12:43.016 }, 00:12:43.017 { 00:12:43.017 "name": "BaseBdev3", 00:12:43.017 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:43.017 "is_configured": true, 00:12:43.017 "data_offset": 0, 00:12:43.017 "data_size": 65536 00:12:43.017 } 00:12:43.017 ] 00:12:43.017 }' 00:12:43.017 13:35:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.017 13:35:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.583 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.583 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:43.841 [2024-07-15 13:35:31.423055] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.841 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.098 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.098 "name": "Existed_Raid", 00:12:44.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.098 "strip_size_kb": 64, 00:12:44.098 "state": "configuring", 00:12:44.098 "raid_level": "concat", 00:12:44.098 "superblock": false, 00:12:44.098 "num_base_bdevs": 3, 00:12:44.098 "num_base_bdevs_discovered": 1, 00:12:44.098 "num_base_bdevs_operational": 3, 00:12:44.098 "base_bdevs_list": [ 00:12:44.098 { 00:12:44.098 "name": null, 00:12:44.098 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:44.098 "is_configured": false, 00:12:44.098 "data_offset": 0, 00:12:44.098 "data_size": 65536 00:12:44.098 }, 00:12:44.098 { 00:12:44.098 "name": null, 00:12:44.098 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:44.098 "is_configured": false, 00:12:44.098 "data_offset": 0, 00:12:44.098 "data_size": 65536 00:12:44.098 }, 00:12:44.098 { 00:12:44.098 "name": "BaseBdev3", 00:12:44.098 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:44.098 "is_configured": true, 00:12:44.098 "data_offset": 0, 00:12:44.098 "data_size": 65536 00:12:44.098 } 00:12:44.098 ] 00:12:44.098 }' 00:12:44.098 13:35:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.098 13:35:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.666 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.666 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:44.923 [2024-07-15 13:35:32.449469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.923 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.190 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.190 "name": "Existed_Raid", 00:12:45.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.190 "strip_size_kb": 64, 00:12:45.190 "state": "configuring", 00:12:45.190 "raid_level": "concat", 00:12:45.190 "superblock": false, 00:12:45.190 "num_base_bdevs": 3, 00:12:45.190 "num_base_bdevs_discovered": 2, 00:12:45.190 "num_base_bdevs_operational": 3, 00:12:45.190 "base_bdevs_list": [ 00:12:45.190 { 00:12:45.190 "name": null, 00:12:45.190 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:45.190 "is_configured": false, 00:12:45.190 "data_offset": 0, 00:12:45.190 "data_size": 65536 00:12:45.190 }, 00:12:45.190 { 00:12:45.190 "name": "BaseBdev2", 00:12:45.190 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:45.190 "is_configured": true, 00:12:45.190 "data_offset": 0, 00:12:45.190 "data_size": 65536 00:12:45.190 }, 00:12:45.190 { 00:12:45.190 "name": "BaseBdev3", 00:12:45.190 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:45.190 "is_configured": true, 00:12:45.190 "data_offset": 0, 00:12:45.190 "data_size": 65536 00:12:45.190 } 00:12:45.190 ] 00:12:45.190 }' 00:12:45.190 13:35:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.190 13:35:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.588 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.588 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:45.844 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:45.844 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.844 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:46.101 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 209bcc63-ad0f-4614-9bb3-38b5273709aa 00:12:46.101 [2024-07-15 13:35:33.643374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:46.102 [2024-07-15 13:35:33.643409] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2234ad0 00:12:46.102 [2024-07-15 13:35:33.643415] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:46.102 [2024-07-15 13:35:33.643560] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2235f20 00:12:46.102 [2024-07-15 13:35:33.643650] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2234ad0 00:12:46.102 [2024-07-15 13:35:33.643656] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2234ad0 00:12:46.102 [2024-07-15 13:35:33.643782] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:46.102 NewBaseBdev 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:46.102 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:46.359 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:46.359 [ 00:12:46.359 { 00:12:46.359 "name": "NewBaseBdev", 00:12:46.359 "aliases": [ 00:12:46.359 "209bcc63-ad0f-4614-9bb3-38b5273709aa" 00:12:46.359 ], 00:12:46.359 "product_name": "Malloc disk", 00:12:46.359 "block_size": 512, 00:12:46.359 "num_blocks": 65536, 00:12:46.359 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:46.359 "assigned_rate_limits": { 00:12:46.359 "rw_ios_per_sec": 0, 00:12:46.359 "rw_mbytes_per_sec": 0, 00:12:46.359 "r_mbytes_per_sec": 0, 00:12:46.359 "w_mbytes_per_sec": 0 00:12:46.359 }, 00:12:46.359 "claimed": true, 00:12:46.359 "claim_type": "exclusive_write", 00:12:46.359 "zoned": false, 00:12:46.359 "supported_io_types": { 00:12:46.359 "read": true, 00:12:46.359 "write": true, 00:12:46.359 "unmap": true, 00:12:46.359 "flush": true, 00:12:46.359 "reset": true, 00:12:46.359 "nvme_admin": false, 00:12:46.359 "nvme_io": false, 00:12:46.359 "nvme_io_md": false, 00:12:46.359 "write_zeroes": true, 00:12:46.359 "zcopy": true, 00:12:46.359 "get_zone_info": false, 00:12:46.359 "zone_management": false, 00:12:46.359 "zone_append": false, 00:12:46.359 "compare": false, 00:12:46.359 "compare_and_write": false, 00:12:46.359 "abort": true, 00:12:46.359 "seek_hole": false, 00:12:46.359 "seek_data": false, 00:12:46.359 "copy": true, 00:12:46.360 "nvme_iov_md": false 00:12:46.360 }, 00:12:46.360 "memory_domains": [ 00:12:46.360 { 00:12:46.360 "dma_device_id": "system", 00:12:46.360 "dma_device_type": 1 00:12:46.360 }, 00:12:46.360 { 00:12:46.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.360 "dma_device_type": 2 00:12:46.360 } 00:12:46.360 ], 00:12:46.360 "driver_specific": {} 00:12:46.360 } 00:12:46.360 ] 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:46.618 13:35:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.618 13:35:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.618 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.618 "name": "Existed_Raid", 00:12:46.618 "uuid": "691f216f-6cff-4d39-b37d-53581f8b5c22", 00:12:46.618 "strip_size_kb": 64, 00:12:46.618 "state": "online", 00:12:46.618 "raid_level": "concat", 00:12:46.618 "superblock": false, 00:12:46.618 "num_base_bdevs": 3, 00:12:46.618 "num_base_bdevs_discovered": 3, 00:12:46.618 "num_base_bdevs_operational": 3, 00:12:46.618 "base_bdevs_list": [ 00:12:46.618 { 00:12:46.618 "name": "NewBaseBdev", 00:12:46.618 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:46.618 "is_configured": true, 00:12:46.618 "data_offset": 0, 00:12:46.618 "data_size": 65536 00:12:46.618 }, 00:12:46.618 { 00:12:46.618 "name": "BaseBdev2", 00:12:46.618 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:46.618 "is_configured": true, 00:12:46.618 "data_offset": 0, 00:12:46.618 "data_size": 65536 00:12:46.618 }, 00:12:46.618 { 00:12:46.618 "name": "BaseBdev3", 00:12:46.618 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:46.618 "is_configured": true, 00:12:46.618 "data_offset": 0, 00:12:46.618 "data_size": 65536 00:12:46.618 } 00:12:46.618 ] 00:12:46.618 }' 00:12:46.618 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.618 13:35:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:47.183 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:47.441 [2024-07-15 13:35:34.826639] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:47.441 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:47.441 "name": "Existed_Raid", 00:12:47.441 "aliases": [ 00:12:47.441 "691f216f-6cff-4d39-b37d-53581f8b5c22" 00:12:47.441 ], 00:12:47.441 "product_name": "Raid Volume", 00:12:47.441 "block_size": 512, 00:12:47.441 "num_blocks": 196608, 00:12:47.441 "uuid": "691f216f-6cff-4d39-b37d-53581f8b5c22", 00:12:47.441 "assigned_rate_limits": { 00:12:47.441 "rw_ios_per_sec": 0, 00:12:47.441 "rw_mbytes_per_sec": 0, 00:12:47.441 "r_mbytes_per_sec": 0, 00:12:47.441 "w_mbytes_per_sec": 0 00:12:47.441 }, 00:12:47.441 "claimed": false, 00:12:47.441 "zoned": false, 00:12:47.441 "supported_io_types": { 00:12:47.441 "read": true, 00:12:47.441 "write": true, 00:12:47.441 "unmap": true, 00:12:47.441 "flush": true, 00:12:47.441 "reset": true, 00:12:47.441 "nvme_admin": false, 00:12:47.441 "nvme_io": false, 00:12:47.441 "nvme_io_md": false, 00:12:47.441 "write_zeroes": true, 00:12:47.441 "zcopy": false, 00:12:47.441 "get_zone_info": false, 00:12:47.441 "zone_management": false, 00:12:47.441 "zone_append": false, 00:12:47.441 "compare": false, 00:12:47.441 "compare_and_write": false, 00:12:47.441 "abort": false, 00:12:47.441 "seek_hole": false, 00:12:47.441 "seek_data": false, 00:12:47.441 "copy": false, 00:12:47.441 "nvme_iov_md": false 00:12:47.441 }, 00:12:47.441 "memory_domains": [ 00:12:47.441 { 00:12:47.441 "dma_device_id": "system", 00:12:47.441 "dma_device_type": 1 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.441 "dma_device_type": 2 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "dma_device_id": "system", 00:12:47.441 "dma_device_type": 1 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.441 "dma_device_type": 2 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "dma_device_id": "system", 00:12:47.441 "dma_device_type": 1 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.441 "dma_device_type": 2 00:12:47.441 } 00:12:47.441 ], 00:12:47.441 "driver_specific": { 00:12:47.441 "raid": { 00:12:47.441 "uuid": "691f216f-6cff-4d39-b37d-53581f8b5c22", 00:12:47.441 "strip_size_kb": 64, 00:12:47.441 "state": "online", 00:12:47.441 "raid_level": "concat", 00:12:47.441 "superblock": false, 00:12:47.441 "num_base_bdevs": 3, 00:12:47.441 "num_base_bdevs_discovered": 3, 00:12:47.441 "num_base_bdevs_operational": 3, 00:12:47.441 "base_bdevs_list": [ 00:12:47.441 { 00:12:47.441 "name": "NewBaseBdev", 00:12:47.441 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:47.441 "is_configured": true, 00:12:47.441 "data_offset": 0, 00:12:47.441 "data_size": 65536 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "name": "BaseBdev2", 00:12:47.441 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:47.441 "is_configured": true, 00:12:47.441 "data_offset": 0, 00:12:47.441 "data_size": 65536 00:12:47.441 }, 00:12:47.441 { 00:12:47.441 "name": "BaseBdev3", 00:12:47.441 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:47.441 "is_configured": true, 00:12:47.441 "data_offset": 0, 00:12:47.441 "data_size": 65536 00:12:47.441 } 00:12:47.441 ] 00:12:47.441 } 00:12:47.441 } 00:12:47.441 }' 00:12:47.441 13:35:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:47.441 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:47.441 BaseBdev2 00:12:47.441 BaseBdev3' 00:12:47.441 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.441 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:47.441 13:35:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.699 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.699 "name": "NewBaseBdev", 00:12:47.699 "aliases": [ 00:12:47.699 "209bcc63-ad0f-4614-9bb3-38b5273709aa" 00:12:47.699 ], 00:12:47.699 "product_name": "Malloc disk", 00:12:47.699 "block_size": 512, 00:12:47.699 "num_blocks": 65536, 00:12:47.699 "uuid": "209bcc63-ad0f-4614-9bb3-38b5273709aa", 00:12:47.699 "assigned_rate_limits": { 00:12:47.699 "rw_ios_per_sec": 0, 00:12:47.699 "rw_mbytes_per_sec": 0, 00:12:47.699 "r_mbytes_per_sec": 0, 00:12:47.699 "w_mbytes_per_sec": 0 00:12:47.699 }, 00:12:47.700 "claimed": true, 00:12:47.700 "claim_type": "exclusive_write", 00:12:47.700 "zoned": false, 00:12:47.700 "supported_io_types": { 00:12:47.700 "read": true, 00:12:47.700 "write": true, 00:12:47.700 "unmap": true, 00:12:47.700 "flush": true, 00:12:47.700 "reset": true, 00:12:47.700 "nvme_admin": false, 00:12:47.700 "nvme_io": false, 00:12:47.700 "nvme_io_md": false, 00:12:47.700 "write_zeroes": true, 00:12:47.700 "zcopy": true, 00:12:47.700 "get_zone_info": false, 00:12:47.700 "zone_management": false, 00:12:47.700 "zone_append": false, 00:12:47.700 "compare": false, 00:12:47.700 "compare_and_write": false, 00:12:47.700 "abort": true, 00:12:47.700 "seek_hole": false, 00:12:47.700 "seek_data": false, 00:12:47.700 "copy": true, 00:12:47.700 "nvme_iov_md": false 00:12:47.700 }, 00:12:47.700 "memory_domains": [ 00:12:47.700 { 00:12:47.700 "dma_device_id": "system", 00:12:47.700 "dma_device_type": 1 00:12:47.700 }, 00:12:47.700 { 00:12:47.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.700 "dma_device_type": 2 00:12:47.700 } 00:12:47.700 ], 00:12:47.700 "driver_specific": {} 00:12:47.700 }' 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.700 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.956 13:35:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.956 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.956 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.956 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:47.956 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.956 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.956 "name": "BaseBdev2", 00:12:47.956 "aliases": [ 00:12:47.956 "058a2a87-2295-44e6-8f80-a5f2c8f329a0" 00:12:47.956 ], 00:12:47.956 "product_name": "Malloc disk", 00:12:47.956 "block_size": 512, 00:12:47.956 "num_blocks": 65536, 00:12:47.956 "uuid": "058a2a87-2295-44e6-8f80-a5f2c8f329a0", 00:12:47.956 "assigned_rate_limits": { 00:12:47.956 "rw_ios_per_sec": 0, 00:12:47.956 "rw_mbytes_per_sec": 0, 00:12:47.956 "r_mbytes_per_sec": 0, 00:12:47.956 "w_mbytes_per_sec": 0 00:12:47.956 }, 00:12:47.956 "claimed": true, 00:12:47.956 "claim_type": "exclusive_write", 00:12:47.956 "zoned": false, 00:12:47.956 "supported_io_types": { 00:12:47.956 "read": true, 00:12:47.956 "write": true, 00:12:47.956 "unmap": true, 00:12:47.956 "flush": true, 00:12:47.956 "reset": true, 00:12:47.956 "nvme_admin": false, 00:12:47.956 "nvme_io": false, 00:12:47.956 "nvme_io_md": false, 00:12:47.956 "write_zeroes": true, 00:12:47.956 "zcopy": true, 00:12:47.956 "get_zone_info": false, 00:12:47.956 "zone_management": false, 00:12:47.956 "zone_append": false, 00:12:47.956 "compare": false, 00:12:47.956 "compare_and_write": false, 00:12:47.956 "abort": true, 00:12:47.956 "seek_hole": false, 00:12:47.956 "seek_data": false, 00:12:47.956 "copy": true, 00:12:47.956 "nvme_iov_md": false 00:12:47.956 }, 00:12:47.956 "memory_domains": [ 00:12:47.956 { 00:12:47.956 "dma_device_id": "system", 00:12:47.956 "dma_device_type": 1 00:12:47.956 }, 00:12:47.956 { 00:12:47.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.956 "dma_device_type": 2 00:12:47.956 } 00:12:47.956 ], 00:12:47.957 "driver_specific": {} 00:12:47.957 }' 00:12:47.957 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.213 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.470 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.470 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.470 13:35:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:48.470 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:48.470 13:35:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.470 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:48.470 "name": "BaseBdev3", 00:12:48.470 "aliases": [ 00:12:48.470 "cfd8e6ff-6675-48b6-9883-c13c112a581d" 00:12:48.470 ], 00:12:48.470 "product_name": "Malloc disk", 00:12:48.470 "block_size": 512, 00:12:48.470 "num_blocks": 65536, 00:12:48.470 "uuid": "cfd8e6ff-6675-48b6-9883-c13c112a581d", 00:12:48.470 "assigned_rate_limits": { 00:12:48.470 "rw_ios_per_sec": 0, 00:12:48.470 "rw_mbytes_per_sec": 0, 00:12:48.470 "r_mbytes_per_sec": 0, 00:12:48.470 "w_mbytes_per_sec": 0 00:12:48.470 }, 00:12:48.470 "claimed": true, 00:12:48.470 "claim_type": "exclusive_write", 00:12:48.470 "zoned": false, 00:12:48.470 "supported_io_types": { 00:12:48.470 "read": true, 00:12:48.470 "write": true, 00:12:48.470 "unmap": true, 00:12:48.470 "flush": true, 00:12:48.470 "reset": true, 00:12:48.470 "nvme_admin": false, 00:12:48.470 "nvme_io": false, 00:12:48.470 "nvme_io_md": false, 00:12:48.470 "write_zeroes": true, 00:12:48.470 "zcopy": true, 00:12:48.470 "get_zone_info": false, 00:12:48.470 "zone_management": false, 00:12:48.470 "zone_append": false, 00:12:48.470 "compare": false, 00:12:48.470 "compare_and_write": false, 00:12:48.470 "abort": true, 00:12:48.470 "seek_hole": false, 00:12:48.470 "seek_data": false, 00:12:48.470 "copy": true, 00:12:48.470 "nvme_iov_md": false 00:12:48.470 }, 00:12:48.470 "memory_domains": [ 00:12:48.470 { 00:12:48.470 "dma_device_id": "system", 00:12:48.470 "dma_device_type": 1 00:12:48.470 }, 00:12:48.470 { 00:12:48.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.470 "dma_device_type": 2 00:12:48.470 } 00:12:48.470 ], 00:12:48.470 "driver_specific": {} 00:12:48.470 }' 00:12:48.470 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.728 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:12:48.987 [2024-07-15 13:35:36.522840] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:48.987 [2024-07-15 13:35:36.522864] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:48.987 [2024-07-15 13:35:36.522908] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:48.987 [2024-07-15 13:35:36.522944] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:48.987 [2024-07-15 13:35:36.522953] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2234ad0 name Existed_Raid, state offline 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4192624 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4192624 ']' 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4192624 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4192624 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4192624' 00:12:48.987 killing process with pid 4192624 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4192624 00:12:48.987 [2024-07-15 13:35:36.580674] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:48.987 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4192624 00:12:48.987 [2024-07-15 13:35:36.605415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:49.245 13:35:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:49.245 00:12:49.245 real 0m21.852s 00:12:49.245 user 0m39.863s 00:12:49.245 sys 0m4.179s 00:12:49.245 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:49.245 13:35:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.245 ************************************ 00:12:49.245 END TEST raid_state_function_test 00:12:49.245 ************************************ 00:12:49.245 13:35:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:49.245 13:35:36 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:12:49.245 13:35:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:49.245 13:35:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:49.245 13:35:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:49.245 ************************************ 00:12:49.245 START TEST raid_state_function_test_sb 00:12:49.245 ************************************ 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:12:49.503 13:35:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2393 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2393' 00:12:49.503 Process raid pid: 2393 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2393 /var/tmp/spdk-raid.sock 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 2393 ']' 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:49.503 13:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:49.504 13:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:49.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:49.504 13:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:49.504 13:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:49.504 [2024-07-15 13:35:36.922894] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:12:49.504 [2024-07-15 13:35:36.922950] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:49.504 [2024-07-15 13:35:37.011276] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:49.504 [2024-07-15 13:35:37.103018] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:49.762 [2024-07-15 13:35:37.160425] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:49.762 [2024-07-15 13:35:37.160449] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.327 13:35:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:50.327 13:35:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:50.327 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:50.327 [2024-07-15 13:35:37.880161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:50.327 [2024-07-15 13:35:37.880195] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:50.327 [2024-07-15 13:35:37.880202] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:50.327 [2024-07-15 13:35:37.880225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:50.327 [2024-07-15 13:35:37.880231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:50.327 [2024-07-15 13:35:37.880238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:50.327 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.328 13:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.585 13:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.585 "name": "Existed_Raid", 00:12:50.585 "uuid": "2f0cca1d-a083-4bc4-a2e6-2138c5dd2b73", 00:12:50.585 "strip_size_kb": 64, 00:12:50.585 "state": "configuring", 00:12:50.585 "raid_level": "concat", 00:12:50.585 "superblock": true, 00:12:50.585 "num_base_bdevs": 3, 00:12:50.585 "num_base_bdevs_discovered": 0, 00:12:50.585 "num_base_bdevs_operational": 3, 00:12:50.585 "base_bdevs_list": [ 00:12:50.585 { 00:12:50.585 "name": "BaseBdev1", 00:12:50.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.585 "is_configured": false, 00:12:50.585 "data_offset": 0, 00:12:50.585 "data_size": 0 00:12:50.585 }, 00:12:50.585 { 00:12:50.585 "name": "BaseBdev2", 00:12:50.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.585 "is_configured": false, 00:12:50.585 "data_offset": 0, 00:12:50.585 "data_size": 0 00:12:50.585 }, 00:12:50.585 { 00:12:50.585 "name": "BaseBdev3", 00:12:50.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.585 "is_configured": false, 00:12:50.585 "data_offset": 0, 00:12:50.585 "data_size": 0 00:12:50.585 } 00:12:50.585 ] 00:12:50.585 }' 00:12:50.585 13:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.585 13:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.176 13:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:51.176 [2024-07-15 13:35:38.678110] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:51.176 [2024-07-15 13:35:38.678132] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x804f50 name Existed_Raid, state configuring 00:12:51.176 13:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:51.433 [2024-07-15 13:35:38.850569] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:51.433 [2024-07-15 13:35:38.850588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:51.433 [2024-07-15 13:35:38.850593] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:51.433 [2024-07-15 13:35:38.850600] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:51.433 [2024-07-15 13:35:38.850606] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:51.433 [2024-07-15 13:35:38.850613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:51.433 13:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:51.433 [2024-07-15 13:35:39.031614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:51.433 BaseBdev1 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:51.433 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:51.691 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:51.949 [ 00:12:51.949 { 00:12:51.949 "name": "BaseBdev1", 00:12:51.949 "aliases": [ 00:12:51.949 "b74b88b3-d6d0-46b5-aa54-97982c1a43da" 00:12:51.949 ], 00:12:51.949 "product_name": "Malloc disk", 00:12:51.949 "block_size": 512, 00:12:51.949 "num_blocks": 65536, 00:12:51.949 "uuid": "b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:51.949 "assigned_rate_limits": { 00:12:51.949 "rw_ios_per_sec": 0, 00:12:51.949 "rw_mbytes_per_sec": 0, 00:12:51.949 "r_mbytes_per_sec": 0, 00:12:51.949 "w_mbytes_per_sec": 0 00:12:51.949 }, 00:12:51.949 "claimed": true, 00:12:51.949 "claim_type": "exclusive_write", 00:12:51.949 "zoned": false, 00:12:51.949 "supported_io_types": { 00:12:51.949 "read": true, 00:12:51.949 "write": true, 00:12:51.949 "unmap": true, 00:12:51.949 "flush": true, 00:12:51.949 "reset": true, 00:12:51.949 "nvme_admin": false, 00:12:51.949 "nvme_io": false, 00:12:51.949 "nvme_io_md": false, 00:12:51.949 "write_zeroes": true, 00:12:51.949 "zcopy": true, 00:12:51.949 "get_zone_info": false, 00:12:51.949 "zone_management": false, 00:12:51.949 "zone_append": false, 00:12:51.949 "compare": false, 00:12:51.949 "compare_and_write": false, 00:12:51.949 "abort": true, 00:12:51.949 "seek_hole": false, 00:12:51.949 "seek_data": false, 00:12:51.949 "copy": true, 00:12:51.949 "nvme_iov_md": false 00:12:51.949 }, 00:12:51.949 "memory_domains": [ 00:12:51.949 { 00:12:51.949 "dma_device_id": "system", 00:12:51.949 "dma_device_type": 1 00:12:51.949 }, 00:12:51.949 { 00:12:51.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.949 "dma_device_type": 2 00:12:51.949 } 00:12:51.949 ], 00:12:51.949 "driver_specific": {} 00:12:51.949 } 00:12:51.949 ] 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@905 -- # return 0 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.949 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.207 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.207 "name": "Existed_Raid", 00:12:52.207 "uuid": "0e34b70d-574c-41e7-b9d3-d76b69c59d4e", 00:12:52.207 "strip_size_kb": 64, 00:12:52.207 "state": "configuring", 00:12:52.207 "raid_level": "concat", 00:12:52.207 "superblock": true, 00:12:52.207 "num_base_bdevs": 3, 00:12:52.207 "num_base_bdevs_discovered": 1, 00:12:52.207 "num_base_bdevs_operational": 3, 00:12:52.207 "base_bdevs_list": [ 00:12:52.207 { 00:12:52.207 "name": "BaseBdev1", 00:12:52.207 "uuid": "b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:52.207 "is_configured": true, 00:12:52.207 "data_offset": 2048, 00:12:52.207 "data_size": 63488 00:12:52.207 }, 00:12:52.207 { 00:12:52.207 "name": "BaseBdev2", 00:12:52.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.207 "is_configured": false, 00:12:52.207 "data_offset": 0, 00:12:52.207 "data_size": 0 00:12:52.207 }, 00:12:52.207 { 00:12:52.207 "name": "BaseBdev3", 00:12:52.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.207 "is_configured": false, 00:12:52.207 "data_offset": 0, 00:12:52.207 "data_size": 0 00:12:52.207 } 00:12:52.207 ] 00:12:52.207 }' 00:12:52.207 13:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.207 13:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:52.770 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:52.770 [2024-07-15 13:35:40.250749] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:52.770 [2024-07-15 13:35:40.250786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x804820 name Existed_Raid, state configuring 00:12:52.770 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:53.028 [2024-07-15 13:35:40.427231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:53.028 [2024-07-15 13:35:40.428268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:53.028 [2024-07-15 13:35:40.428290] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:53.028 [2024-07-15 13:35:40.428297] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:53.028 [2024-07-15 13:35:40.428304] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.028 "name": "Existed_Raid", 00:12:53.028 "uuid": "b9839fb7-7930-44fe-9f53-60da9e1e32b5", 00:12:53.028 "strip_size_kb": 64, 00:12:53.028 "state": "configuring", 00:12:53.028 "raid_level": "concat", 00:12:53.028 "superblock": true, 00:12:53.028 "num_base_bdevs": 3, 00:12:53.028 "num_base_bdevs_discovered": 1, 00:12:53.028 "num_base_bdevs_operational": 3, 00:12:53.028 "base_bdevs_list": [ 00:12:53.028 { 00:12:53.028 "name": "BaseBdev1", 00:12:53.028 "uuid": "b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:53.028 "is_configured": true, 00:12:53.028 "data_offset": 2048, 00:12:53.028 "data_size": 63488 00:12:53.028 }, 00:12:53.028 { 00:12:53.028 "name": "BaseBdev2", 00:12:53.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.028 "is_configured": false, 00:12:53.028 "data_offset": 0, 00:12:53.028 "data_size": 0 00:12:53.028 }, 00:12:53.028 { 
00:12:53.028 "name": "BaseBdev3", 00:12:53.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.028 "is_configured": false, 00:12:53.028 "data_offset": 0, 00:12:53.028 "data_size": 0 00:12:53.028 } 00:12:53.028 ] 00:12:53.028 }' 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.028 13:35:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:53.591 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:53.848 [2024-07-15 13:35:41.280197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:53.848 BaseBdev2 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:53.848 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:54.105 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:54.105 [ 00:12:54.105 { 00:12:54.105 "name": "BaseBdev2", 00:12:54.105 "aliases": [ 00:12:54.105 "06dc36d0-81fd-471e-aa1c-10da7c4334e3" 00:12:54.105 ], 00:12:54.105 "product_name": "Malloc disk", 00:12:54.105 "block_size": 512, 00:12:54.105 "num_blocks": 65536, 00:12:54.105 "uuid": "06dc36d0-81fd-471e-aa1c-10da7c4334e3", 00:12:54.105 "assigned_rate_limits": { 00:12:54.105 "rw_ios_per_sec": 0, 00:12:54.105 "rw_mbytes_per_sec": 0, 00:12:54.105 "r_mbytes_per_sec": 0, 00:12:54.105 "w_mbytes_per_sec": 0 00:12:54.105 }, 00:12:54.105 "claimed": true, 00:12:54.105 "claim_type": "exclusive_write", 00:12:54.105 "zoned": false, 00:12:54.105 "supported_io_types": { 00:12:54.105 "read": true, 00:12:54.105 "write": true, 00:12:54.105 "unmap": true, 00:12:54.105 "flush": true, 00:12:54.105 "reset": true, 00:12:54.105 "nvme_admin": false, 00:12:54.105 "nvme_io": false, 00:12:54.106 "nvme_io_md": false, 00:12:54.106 "write_zeroes": true, 00:12:54.106 "zcopy": true, 00:12:54.106 "get_zone_info": false, 00:12:54.106 "zone_management": false, 00:12:54.106 "zone_append": false, 00:12:54.106 "compare": false, 00:12:54.106 "compare_and_write": false, 00:12:54.106 "abort": true, 00:12:54.106 "seek_hole": false, 00:12:54.106 "seek_data": false, 00:12:54.106 "copy": true, 00:12:54.106 "nvme_iov_md": false 00:12:54.106 }, 00:12:54.106 "memory_domains": [ 00:12:54.106 { 00:12:54.106 "dma_device_id": "system", 00:12:54.106 "dma_device_type": 1 00:12:54.106 }, 00:12:54.106 { 00:12:54.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.106 "dma_device_type": 2 00:12:54.106 } 00:12:54.106 ], 00:12:54.106 
"driver_specific": {} 00:12:54.106 } 00:12:54.106 ] 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.106 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.363 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.363 "name": "Existed_Raid", 00:12:54.363 "uuid": "b9839fb7-7930-44fe-9f53-60da9e1e32b5", 00:12:54.363 "strip_size_kb": 64, 00:12:54.363 "state": "configuring", 00:12:54.363 "raid_level": "concat", 00:12:54.363 "superblock": true, 00:12:54.363 "num_base_bdevs": 3, 00:12:54.363 "num_base_bdevs_discovered": 2, 00:12:54.363 "num_base_bdevs_operational": 3, 00:12:54.363 "base_bdevs_list": [ 00:12:54.363 { 00:12:54.363 "name": "BaseBdev1", 00:12:54.363 "uuid": "b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:54.363 "is_configured": true, 00:12:54.363 "data_offset": 2048, 00:12:54.363 "data_size": 63488 00:12:54.363 }, 00:12:54.363 { 00:12:54.363 "name": "BaseBdev2", 00:12:54.363 "uuid": "06dc36d0-81fd-471e-aa1c-10da7c4334e3", 00:12:54.363 "is_configured": true, 00:12:54.363 "data_offset": 2048, 00:12:54.363 "data_size": 63488 00:12:54.363 }, 00:12:54.363 { 00:12:54.363 "name": "BaseBdev3", 00:12:54.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:54.363 "is_configured": false, 00:12:54.363 "data_offset": 0, 00:12:54.363 "data_size": 0 00:12:54.363 } 00:12:54.363 ] 00:12:54.363 }' 00:12:54.363 13:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.363 13:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:12:54.929 [2024-07-15 13:35:42.470192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:54.929 [2024-07-15 13:35:42.470321] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x805710 00:12:54.929 [2024-07-15 13:35:42.470331] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:54.929 [2024-07-15 13:35:42.470452] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8053e0 00:12:54.929 [2024-07-15 13:35:42.470536] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x805710 00:12:54.929 [2024-07-15 13:35:42.470543] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x805710 00:12:54.929 [2024-07-15 13:35:42.470603] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.929 BaseBdev3 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:54.929 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:55.187 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:55.445 [ 00:12:55.445 { 00:12:55.445 "name": "BaseBdev3", 00:12:55.445 "aliases": [ 00:12:55.445 "05af4a24-d272-4dee-918a-ddc288cec3e3" 00:12:55.445 ], 00:12:55.445 "product_name": "Malloc disk", 00:12:55.445 "block_size": 512, 00:12:55.445 "num_blocks": 65536, 00:12:55.445 "uuid": "05af4a24-d272-4dee-918a-ddc288cec3e3", 00:12:55.445 "assigned_rate_limits": { 00:12:55.445 "rw_ios_per_sec": 0, 00:12:55.445 "rw_mbytes_per_sec": 0, 00:12:55.445 "r_mbytes_per_sec": 0, 00:12:55.445 "w_mbytes_per_sec": 0 00:12:55.445 }, 00:12:55.445 "claimed": true, 00:12:55.445 "claim_type": "exclusive_write", 00:12:55.445 "zoned": false, 00:12:55.445 "supported_io_types": { 00:12:55.445 "read": true, 00:12:55.445 "write": true, 00:12:55.445 "unmap": true, 00:12:55.445 "flush": true, 00:12:55.445 "reset": true, 00:12:55.445 "nvme_admin": false, 00:12:55.445 "nvme_io": false, 00:12:55.445 "nvme_io_md": false, 00:12:55.445 "write_zeroes": true, 00:12:55.445 "zcopy": true, 00:12:55.445 "get_zone_info": false, 00:12:55.445 "zone_management": false, 00:12:55.445 "zone_append": false, 00:12:55.445 "compare": false, 00:12:55.445 "compare_and_write": false, 00:12:55.445 "abort": true, 00:12:55.445 "seek_hole": false, 00:12:55.445 "seek_data": false, 00:12:55.445 "copy": true, 00:12:55.445 "nvme_iov_md": false 00:12:55.445 }, 00:12:55.445 "memory_domains": [ 00:12:55.445 { 00:12:55.445 "dma_device_id": "system", 00:12:55.445 "dma_device_type": 1 00:12:55.445 }, 00:12:55.445 { 00:12:55.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:12:55.445 "dma_device_type": 2 00:12:55.445 } 00:12:55.445 ], 00:12:55.445 "driver_specific": {} 00:12:55.445 } 00:12:55.445 ] 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.445 13:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:55.445 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.445 "name": "Existed_Raid", 00:12:55.445 "uuid": "b9839fb7-7930-44fe-9f53-60da9e1e32b5", 00:12:55.445 "strip_size_kb": 64, 00:12:55.445 "state": "online", 00:12:55.445 "raid_level": "concat", 00:12:55.445 "superblock": true, 00:12:55.445 "num_base_bdevs": 3, 00:12:55.445 "num_base_bdevs_discovered": 3, 00:12:55.445 "num_base_bdevs_operational": 3, 00:12:55.445 "base_bdevs_list": [ 00:12:55.445 { 00:12:55.445 "name": "BaseBdev1", 00:12:55.445 "uuid": "b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:55.445 "is_configured": true, 00:12:55.445 "data_offset": 2048, 00:12:55.445 "data_size": 63488 00:12:55.445 }, 00:12:55.445 { 00:12:55.445 "name": "BaseBdev2", 00:12:55.445 "uuid": "06dc36d0-81fd-471e-aa1c-10da7c4334e3", 00:12:55.445 "is_configured": true, 00:12:55.445 "data_offset": 2048, 00:12:55.445 "data_size": 63488 00:12:55.445 }, 00:12:55.445 { 00:12:55.445 "name": "BaseBdev3", 00:12:55.445 "uuid": "05af4a24-d272-4dee-918a-ddc288cec3e3", 00:12:55.445 "is_configured": true, 00:12:55.445 "data_offset": 2048, 00:12:55.445 "data_size": 63488 00:12:55.445 } 00:12:55.445 ] 00:12:55.445 }' 00:12:55.445 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.445 13:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:56.011 13:35:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:56.011 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:56.269 [2024-07-15 13:35:43.665488] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:56.269 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:56.269 "name": "Existed_Raid", 00:12:56.269 "aliases": [ 00:12:56.269 "b9839fb7-7930-44fe-9f53-60da9e1e32b5" 00:12:56.270 ], 00:12:56.270 "product_name": "Raid Volume", 00:12:56.270 "block_size": 512, 00:12:56.270 "num_blocks": 190464, 00:12:56.270 "uuid": "b9839fb7-7930-44fe-9f53-60da9e1e32b5", 00:12:56.270 "assigned_rate_limits": { 00:12:56.270 "rw_ios_per_sec": 0, 00:12:56.270 "rw_mbytes_per_sec": 0, 00:12:56.270 "r_mbytes_per_sec": 0, 00:12:56.270 "w_mbytes_per_sec": 0 00:12:56.270 }, 00:12:56.270 "claimed": false, 00:12:56.270 "zoned": false, 00:12:56.270 "supported_io_types": { 00:12:56.270 "read": true, 00:12:56.270 "write": true, 00:12:56.270 "unmap": true, 00:12:56.270 "flush": true, 00:12:56.270 "reset": true, 00:12:56.270 "nvme_admin": false, 00:12:56.270 "nvme_io": false, 00:12:56.270 "nvme_io_md": false, 00:12:56.270 "write_zeroes": true, 00:12:56.270 "zcopy": false, 00:12:56.270 "get_zone_info": false, 00:12:56.270 "zone_management": false, 00:12:56.270 "zone_append": false, 00:12:56.270 "compare": false, 00:12:56.270 "compare_and_write": false, 00:12:56.270 "abort": false, 00:12:56.270 "seek_hole": false, 00:12:56.270 "seek_data": false, 00:12:56.270 "copy": false, 00:12:56.270 "nvme_iov_md": false 00:12:56.270 }, 00:12:56.270 "memory_domains": [ 00:12:56.270 { 00:12:56.270 "dma_device_id": "system", 00:12:56.270 "dma_device_type": 1 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.270 "dma_device_type": 2 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "dma_device_id": "system", 00:12:56.270 "dma_device_type": 1 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.270 "dma_device_type": 2 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "dma_device_id": "system", 00:12:56.270 "dma_device_type": 1 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.270 "dma_device_type": 2 00:12:56.270 } 00:12:56.270 ], 00:12:56.270 "driver_specific": { 00:12:56.270 "raid": { 00:12:56.270 "uuid": "b9839fb7-7930-44fe-9f53-60da9e1e32b5", 00:12:56.270 "strip_size_kb": 64, 00:12:56.270 "state": "online", 00:12:56.270 "raid_level": "concat", 00:12:56.270 "superblock": true, 00:12:56.270 "num_base_bdevs": 3, 00:12:56.270 "num_base_bdevs_discovered": 3, 00:12:56.270 "num_base_bdevs_operational": 3, 00:12:56.270 "base_bdevs_list": [ 00:12:56.270 { 00:12:56.270 "name": "BaseBdev1", 00:12:56.270 "uuid": 
"b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:56.270 "is_configured": true, 00:12:56.270 "data_offset": 2048, 00:12:56.270 "data_size": 63488 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "name": "BaseBdev2", 00:12:56.270 "uuid": "06dc36d0-81fd-471e-aa1c-10da7c4334e3", 00:12:56.270 "is_configured": true, 00:12:56.270 "data_offset": 2048, 00:12:56.270 "data_size": 63488 00:12:56.270 }, 00:12:56.270 { 00:12:56.270 "name": "BaseBdev3", 00:12:56.270 "uuid": "05af4a24-d272-4dee-918a-ddc288cec3e3", 00:12:56.270 "is_configured": true, 00:12:56.270 "data_offset": 2048, 00:12:56.270 "data_size": 63488 00:12:56.270 } 00:12:56.270 ] 00:12:56.270 } 00:12:56.270 } 00:12:56.270 }' 00:12:56.270 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:56.270 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:56.270 BaseBdev2 00:12:56.270 BaseBdev3' 00:12:56.270 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.270 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:56.270 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:56.528 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:56.528 "name": "BaseBdev1", 00:12:56.528 "aliases": [ 00:12:56.528 "b74b88b3-d6d0-46b5-aa54-97982c1a43da" 00:12:56.528 ], 00:12:56.528 "product_name": "Malloc disk", 00:12:56.528 "block_size": 512, 00:12:56.528 "num_blocks": 65536, 00:12:56.528 "uuid": "b74b88b3-d6d0-46b5-aa54-97982c1a43da", 00:12:56.528 "assigned_rate_limits": { 00:12:56.528 "rw_ios_per_sec": 0, 00:12:56.528 "rw_mbytes_per_sec": 0, 00:12:56.528 "r_mbytes_per_sec": 0, 00:12:56.528 "w_mbytes_per_sec": 0 00:12:56.528 }, 00:12:56.528 "claimed": true, 00:12:56.528 "claim_type": "exclusive_write", 00:12:56.528 "zoned": false, 00:12:56.528 "supported_io_types": { 00:12:56.528 "read": true, 00:12:56.528 "write": true, 00:12:56.528 "unmap": true, 00:12:56.528 "flush": true, 00:12:56.528 "reset": true, 00:12:56.528 "nvme_admin": false, 00:12:56.528 "nvme_io": false, 00:12:56.528 "nvme_io_md": false, 00:12:56.528 "write_zeroes": true, 00:12:56.528 "zcopy": true, 00:12:56.528 "get_zone_info": false, 00:12:56.528 "zone_management": false, 00:12:56.528 "zone_append": false, 00:12:56.528 "compare": false, 00:12:56.528 "compare_and_write": false, 00:12:56.528 "abort": true, 00:12:56.528 "seek_hole": false, 00:12:56.528 "seek_data": false, 00:12:56.528 "copy": true, 00:12:56.528 "nvme_iov_md": false 00:12:56.528 }, 00:12:56.528 "memory_domains": [ 00:12:56.528 { 00:12:56.528 "dma_device_id": "system", 00:12:56.528 "dma_device_type": 1 00:12:56.528 }, 00:12:56.528 { 00:12:56.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.528 "dma_device_type": 2 00:12:56.528 } 00:12:56.528 ], 00:12:56.528 "driver_specific": {} 00:12:56.528 }' 00:12:56.528 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.528 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.528 13:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:56.528 13:35:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.528 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.528 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:56.528 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.528 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.528 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:56.528 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.786 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.786 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:56.786 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.786 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:56.786 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.044 "name": "BaseBdev2", 00:12:57.044 "aliases": [ 00:12:57.044 "06dc36d0-81fd-471e-aa1c-10da7c4334e3" 00:12:57.044 ], 00:12:57.044 "product_name": "Malloc disk", 00:12:57.044 "block_size": 512, 00:12:57.044 "num_blocks": 65536, 00:12:57.044 "uuid": "06dc36d0-81fd-471e-aa1c-10da7c4334e3", 00:12:57.044 "assigned_rate_limits": { 00:12:57.044 "rw_ios_per_sec": 0, 00:12:57.044 "rw_mbytes_per_sec": 0, 00:12:57.044 "r_mbytes_per_sec": 0, 00:12:57.044 "w_mbytes_per_sec": 0 00:12:57.044 }, 00:12:57.044 "claimed": true, 00:12:57.044 "claim_type": "exclusive_write", 00:12:57.044 "zoned": false, 00:12:57.044 "supported_io_types": { 00:12:57.044 "read": true, 00:12:57.044 "write": true, 00:12:57.044 "unmap": true, 00:12:57.044 "flush": true, 00:12:57.044 "reset": true, 00:12:57.044 "nvme_admin": false, 00:12:57.044 "nvme_io": false, 00:12:57.044 "nvme_io_md": false, 00:12:57.044 "write_zeroes": true, 00:12:57.044 "zcopy": true, 00:12:57.044 "get_zone_info": false, 00:12:57.044 "zone_management": false, 00:12:57.044 "zone_append": false, 00:12:57.044 "compare": false, 00:12:57.044 "compare_and_write": false, 00:12:57.044 "abort": true, 00:12:57.044 "seek_hole": false, 00:12:57.044 "seek_data": false, 00:12:57.044 "copy": true, 00:12:57.044 "nvme_iov_md": false 00:12:57.044 }, 00:12:57.044 "memory_domains": [ 00:12:57.044 { 00:12:57.044 "dma_device_id": "system", 00:12:57.044 "dma_device_type": 1 00:12:57.044 }, 00:12:57.044 { 00:12:57.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.044 "dma_device_type": 2 00:12:57.044 } 00:12:57.044 ], 00:12:57.044 "driver_specific": {} 00:12:57.044 }' 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.044 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.309 "name": "BaseBdev3", 00:12:57.309 "aliases": [ 00:12:57.309 "05af4a24-d272-4dee-918a-ddc288cec3e3" 00:12:57.309 ], 00:12:57.309 "product_name": "Malloc disk", 00:12:57.309 "block_size": 512, 00:12:57.309 "num_blocks": 65536, 00:12:57.309 "uuid": "05af4a24-d272-4dee-918a-ddc288cec3e3", 00:12:57.309 "assigned_rate_limits": { 00:12:57.309 "rw_ios_per_sec": 0, 00:12:57.309 "rw_mbytes_per_sec": 0, 00:12:57.309 "r_mbytes_per_sec": 0, 00:12:57.309 "w_mbytes_per_sec": 0 00:12:57.309 }, 00:12:57.309 "claimed": true, 00:12:57.309 "claim_type": "exclusive_write", 00:12:57.309 "zoned": false, 00:12:57.309 "supported_io_types": { 00:12:57.309 "read": true, 00:12:57.309 "write": true, 00:12:57.309 "unmap": true, 00:12:57.309 "flush": true, 00:12:57.309 "reset": true, 00:12:57.309 "nvme_admin": false, 00:12:57.309 "nvme_io": false, 00:12:57.309 "nvme_io_md": false, 00:12:57.309 "write_zeroes": true, 00:12:57.309 "zcopy": true, 00:12:57.309 "get_zone_info": false, 00:12:57.309 "zone_management": false, 00:12:57.309 "zone_append": false, 00:12:57.309 "compare": false, 00:12:57.309 "compare_and_write": false, 00:12:57.309 "abort": true, 00:12:57.309 "seek_hole": false, 00:12:57.309 "seek_data": false, 00:12:57.309 "copy": true, 00:12:57.309 "nvme_iov_md": false 00:12:57.309 }, 00:12:57.309 "memory_domains": [ 00:12:57.309 { 00:12:57.309 "dma_device_id": "system", 00:12:57.309 "dma_device_type": 1 00:12:57.309 }, 00:12:57.309 { 00:12:57.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.309 "dma_device_type": 2 00:12:57.309 } 00:12:57.309 ], 00:12:57.309 "driver_specific": {} 00:12:57.309 }' 00:12:57.309 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.567 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.567 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.567 13:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.567 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.567 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.567 
13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.567 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.567 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.568 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.568 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:57.826 [2024-07-15 13:35:45.361705] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:57.826 [2024-07-15 13:35:45.361729] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:57.826 [2024-07-15 13:35:45.361758] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.826 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.084 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.084 "name": "Existed_Raid", 00:12:58.084 "uuid": "b9839fb7-7930-44fe-9f53-60da9e1e32b5", 00:12:58.084 "strip_size_kb": 64, 00:12:58.084 "state": "offline", 00:12:58.084 "raid_level": 
"concat", 00:12:58.084 "superblock": true, 00:12:58.084 "num_base_bdevs": 3, 00:12:58.084 "num_base_bdevs_discovered": 2, 00:12:58.084 "num_base_bdevs_operational": 2, 00:12:58.084 "base_bdevs_list": [ 00:12:58.084 { 00:12:58.084 "name": null, 00:12:58.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.084 "is_configured": false, 00:12:58.084 "data_offset": 2048, 00:12:58.084 "data_size": 63488 00:12:58.084 }, 00:12:58.084 { 00:12:58.084 "name": "BaseBdev2", 00:12:58.084 "uuid": "06dc36d0-81fd-471e-aa1c-10da7c4334e3", 00:12:58.084 "is_configured": true, 00:12:58.084 "data_offset": 2048, 00:12:58.084 "data_size": 63488 00:12:58.084 }, 00:12:58.084 { 00:12:58.084 "name": "BaseBdev3", 00:12:58.084 "uuid": "05af4a24-d272-4dee-918a-ddc288cec3e3", 00:12:58.084 "is_configured": true, 00:12:58.084 "data_offset": 2048, 00:12:58.084 "data_size": 63488 00:12:58.084 } 00:12:58.084 ] 00:12:58.084 }' 00:12:58.084 13:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.084 13:35:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:58.649 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:58.907 [2024-07-15 13:35:46.361045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:58.907 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:58.907 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:58.907 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:58.907 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:59.166 [2024-07-15 13:35:46.717832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:59.166 [2024-07-15 13:35:46.717867] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x805710 name Existed_Raid, state offline 00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.166 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:59.425 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:59.425 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:59.425 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:59.425 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:59.425 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:59.425 13:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:59.684 BaseBdev2 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:59.684 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:59.942 [ 00:12:59.942 { 00:12:59.942 "name": "BaseBdev2", 00:12:59.942 "aliases": [ 00:12:59.942 "27b9491c-ddf0-4d13-9703-46887c5597f8" 00:12:59.942 ], 00:12:59.942 "product_name": "Malloc disk", 00:12:59.942 "block_size": 512, 00:12:59.942 "num_blocks": 65536, 00:12:59.942 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:12:59.942 "assigned_rate_limits": { 00:12:59.942 "rw_ios_per_sec": 0, 00:12:59.942 "rw_mbytes_per_sec": 0, 00:12:59.942 "r_mbytes_per_sec": 0, 00:12:59.942 "w_mbytes_per_sec": 0 00:12:59.942 }, 00:12:59.942 "claimed": false, 00:12:59.942 "zoned": false, 00:12:59.942 "supported_io_types": { 00:12:59.942 "read": true, 00:12:59.942 "write": true, 00:12:59.942 "unmap": true, 00:12:59.942 "flush": true, 00:12:59.942 "reset": true, 00:12:59.942 "nvme_admin": false, 00:12:59.942 "nvme_io": false, 00:12:59.942 "nvme_io_md": false, 00:12:59.942 "write_zeroes": true, 00:12:59.942 "zcopy": true, 00:12:59.942 "get_zone_info": false, 00:12:59.942 "zone_management": false, 00:12:59.942 "zone_append": false, 00:12:59.942 "compare": false, 00:12:59.942 "compare_and_write": false, 00:12:59.942 "abort": true, 00:12:59.942 "seek_hole": false, 00:12:59.942 "seek_data": false, 00:12:59.942 "copy": 
true, 00:12:59.942 "nvme_iov_md": false 00:12:59.942 }, 00:12:59.942 "memory_domains": [ 00:12:59.942 { 00:12:59.942 "dma_device_id": "system", 00:12:59.942 "dma_device_type": 1 00:12:59.942 }, 00:12:59.942 { 00:12:59.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.942 "dma_device_type": 2 00:12:59.942 } 00:12:59.942 ], 00:12:59.942 "driver_specific": {} 00:12:59.942 } 00:12:59.942 ] 00:12:59.942 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:59.942 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:59.942 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:59.942 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:00.200 BaseBdev3 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:00.200 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:00.458 [ 00:13:00.458 { 00:13:00.458 "name": "BaseBdev3", 00:13:00.458 "aliases": [ 00:13:00.458 "50f9a9c1-54aa-458a-91b6-8806b7fac31c" 00:13:00.458 ], 00:13:00.458 "product_name": "Malloc disk", 00:13:00.458 "block_size": 512, 00:13:00.458 "num_blocks": 65536, 00:13:00.458 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:00.458 "assigned_rate_limits": { 00:13:00.458 "rw_ios_per_sec": 0, 00:13:00.458 "rw_mbytes_per_sec": 0, 00:13:00.458 "r_mbytes_per_sec": 0, 00:13:00.458 "w_mbytes_per_sec": 0 00:13:00.458 }, 00:13:00.458 "claimed": false, 00:13:00.458 "zoned": false, 00:13:00.458 "supported_io_types": { 00:13:00.458 "read": true, 00:13:00.458 "write": true, 00:13:00.458 "unmap": true, 00:13:00.458 "flush": true, 00:13:00.458 "reset": true, 00:13:00.458 "nvme_admin": false, 00:13:00.458 "nvme_io": false, 00:13:00.458 "nvme_io_md": false, 00:13:00.458 "write_zeroes": true, 00:13:00.458 "zcopy": true, 00:13:00.458 "get_zone_info": false, 00:13:00.458 "zone_management": false, 00:13:00.458 "zone_append": false, 00:13:00.458 "compare": false, 00:13:00.458 "compare_and_write": false, 00:13:00.458 "abort": true, 00:13:00.458 "seek_hole": false, 00:13:00.458 "seek_data": false, 00:13:00.458 "copy": true, 00:13:00.458 "nvme_iov_md": false 00:13:00.458 }, 00:13:00.458 "memory_domains": [ 00:13:00.458 { 00:13:00.458 "dma_device_id": "system", 00:13:00.458 "dma_device_type": 1 00:13:00.458 }, 00:13:00.458 { 00:13:00.458 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:00.458 "dma_device_type": 2 00:13:00.458 } 00:13:00.458 ], 00:13:00.458 "driver_specific": {} 00:13:00.458 } 00:13:00.458 ] 00:13:00.458 13:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:00.458 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:00.458 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:00.458 13:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:00.458 [2024-07-15 13:35:48.075082] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:00.458 [2024-07-15 13:35:48.075117] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:00.458 [2024-07-15 13:35:48.075131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:00.458 [2024-07-15 13:35:48.076170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.716 "name": "Existed_Raid", 00:13:00.716 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:00.716 "strip_size_kb": 64, 00:13:00.716 "state": "configuring", 00:13:00.716 "raid_level": "concat", 00:13:00.716 "superblock": true, 00:13:00.716 "num_base_bdevs": 3, 00:13:00.716 "num_base_bdevs_discovered": 2, 00:13:00.716 "num_base_bdevs_operational": 3, 00:13:00.716 "base_bdevs_list": [ 00:13:00.716 { 00:13:00.716 "name": "BaseBdev1", 00:13:00.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.716 "is_configured": false, 00:13:00.716 "data_offset": 0, 00:13:00.716 "data_size": 0 00:13:00.716 }, 00:13:00.716 { 00:13:00.716 "name": 
"BaseBdev2", 00:13:00.716 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:00.716 "is_configured": true, 00:13:00.716 "data_offset": 2048, 00:13:00.716 "data_size": 63488 00:13:00.716 }, 00:13:00.716 { 00:13:00.716 "name": "BaseBdev3", 00:13:00.716 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:00.716 "is_configured": true, 00:13:00.716 "data_offset": 2048, 00:13:00.716 "data_size": 63488 00:13:00.716 } 00:13:00.716 ] 00:13:00.716 }' 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.716 13:35:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:01.282 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:01.540 [2024-07-15 13:35:48.905187] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.540 13:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.540 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.540 "name": "Existed_Raid", 00:13:01.540 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:01.540 "strip_size_kb": 64, 00:13:01.540 "state": "configuring", 00:13:01.540 "raid_level": "concat", 00:13:01.540 "superblock": true, 00:13:01.540 "num_base_bdevs": 3, 00:13:01.540 "num_base_bdevs_discovered": 1, 00:13:01.540 "num_base_bdevs_operational": 3, 00:13:01.540 "base_bdevs_list": [ 00:13:01.540 { 00:13:01.540 "name": "BaseBdev1", 00:13:01.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.540 "is_configured": false, 00:13:01.540 "data_offset": 0, 00:13:01.540 "data_size": 0 00:13:01.540 }, 00:13:01.540 { 00:13:01.540 "name": null, 00:13:01.540 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:01.540 "is_configured": false, 00:13:01.540 "data_offset": 2048, 00:13:01.540 "data_size": 63488 00:13:01.540 }, 00:13:01.540 { 00:13:01.540 "name": "BaseBdev3", 00:13:01.540 "uuid": 
"50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:01.540 "is_configured": true, 00:13:01.540 "data_offset": 2048, 00:13:01.540 "data_size": 63488 00:13:01.540 } 00:13:01.540 ] 00:13:01.540 }' 00:13:01.540 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.540 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:02.106 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.106 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:02.364 [2024-07-15 13:35:49.914593] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:02.364 BaseBdev1 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:02.364 13:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:02.621 13:35:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:02.878 [ 00:13:02.878 { 00:13:02.878 "name": "BaseBdev1", 00:13:02.878 "aliases": [ 00:13:02.878 "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f" 00:13:02.878 ], 00:13:02.878 "product_name": "Malloc disk", 00:13:02.878 "block_size": 512, 00:13:02.878 "num_blocks": 65536, 00:13:02.878 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:02.878 "assigned_rate_limits": { 00:13:02.878 "rw_ios_per_sec": 0, 00:13:02.878 "rw_mbytes_per_sec": 0, 00:13:02.878 "r_mbytes_per_sec": 0, 00:13:02.878 "w_mbytes_per_sec": 0 00:13:02.878 }, 00:13:02.878 "claimed": true, 00:13:02.878 "claim_type": "exclusive_write", 00:13:02.878 "zoned": false, 00:13:02.878 "supported_io_types": { 00:13:02.878 "read": true, 00:13:02.878 "write": true, 00:13:02.878 "unmap": true, 00:13:02.878 "flush": true, 00:13:02.878 "reset": true, 00:13:02.878 "nvme_admin": false, 00:13:02.878 "nvme_io": false, 00:13:02.879 "nvme_io_md": false, 00:13:02.879 "write_zeroes": true, 00:13:02.879 "zcopy": true, 00:13:02.879 "get_zone_info": false, 00:13:02.879 "zone_management": false, 00:13:02.879 "zone_append": false, 00:13:02.879 "compare": false, 00:13:02.879 "compare_and_write": false, 00:13:02.879 "abort": true, 00:13:02.879 "seek_hole": false, 
00:13:02.879 "seek_data": false, 00:13:02.879 "copy": true, 00:13:02.879 "nvme_iov_md": false 00:13:02.879 }, 00:13:02.879 "memory_domains": [ 00:13:02.879 { 00:13:02.879 "dma_device_id": "system", 00:13:02.879 "dma_device_type": 1 00:13:02.879 }, 00:13:02.879 { 00:13:02.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.879 "dma_device_type": 2 00:13:02.879 } 00:13:02.879 ], 00:13:02.879 "driver_specific": {} 00:13:02.879 } 00:13:02.879 ] 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.879 "name": "Existed_Raid", 00:13:02.879 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:02.879 "strip_size_kb": 64, 00:13:02.879 "state": "configuring", 00:13:02.879 "raid_level": "concat", 00:13:02.879 "superblock": true, 00:13:02.879 "num_base_bdevs": 3, 00:13:02.879 "num_base_bdevs_discovered": 2, 00:13:02.879 "num_base_bdevs_operational": 3, 00:13:02.879 "base_bdevs_list": [ 00:13:02.879 { 00:13:02.879 "name": "BaseBdev1", 00:13:02.879 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:02.879 "is_configured": true, 00:13:02.879 "data_offset": 2048, 00:13:02.879 "data_size": 63488 00:13:02.879 }, 00:13:02.879 { 00:13:02.879 "name": null, 00:13:02.879 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:02.879 "is_configured": false, 00:13:02.879 "data_offset": 2048, 00:13:02.879 "data_size": 63488 00:13:02.879 }, 00:13:02.879 { 00:13:02.879 "name": "BaseBdev3", 00:13:02.879 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:02.879 "is_configured": true, 00:13:02.879 "data_offset": 2048, 00:13:02.879 "data_size": 63488 00:13:02.879 } 00:13:02.879 ] 00:13:02.879 }' 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.879 13:35:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:03.443 13:35:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.443 13:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:03.700 [2024-07-15 13:35:51.298178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.700 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.957 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.957 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.957 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.957 "name": "Existed_Raid", 00:13:03.957 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:03.957 "strip_size_kb": 64, 00:13:03.957 "state": "configuring", 00:13:03.957 "raid_level": "concat", 00:13:03.957 "superblock": true, 00:13:03.957 "num_base_bdevs": 3, 00:13:03.957 "num_base_bdevs_discovered": 1, 00:13:03.957 "num_base_bdevs_operational": 3, 00:13:03.957 "base_bdevs_list": [ 00:13:03.957 { 00:13:03.957 "name": "BaseBdev1", 00:13:03.957 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:03.957 "is_configured": true, 00:13:03.957 "data_offset": 2048, 00:13:03.957 "data_size": 63488 00:13:03.957 }, 00:13:03.957 { 00:13:03.957 "name": null, 00:13:03.957 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:03.957 "is_configured": false, 00:13:03.957 "data_offset": 2048, 00:13:03.957 "data_size": 63488 00:13:03.957 }, 00:13:03.957 { 00:13:03.957 "name": null, 00:13:03.957 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:03.957 "is_configured": false, 00:13:03.957 "data_offset": 2048, 00:13:03.957 "data_size": 63488 00:13:03.957 } 00:13:03.957 ] 00:13:03.957 }' 00:13:03.957 13:35:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.957 13:35:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:04.521 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.521 13:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:04.778 [2024-07-15 13:35:52.320834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.778 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.035 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.035 "name": "Existed_Raid", 00:13:05.035 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:05.035 "strip_size_kb": 64, 00:13:05.035 "state": "configuring", 00:13:05.035 "raid_level": "concat", 00:13:05.035 "superblock": true, 00:13:05.035 "num_base_bdevs": 3, 00:13:05.035 "num_base_bdevs_discovered": 2, 00:13:05.035 "num_base_bdevs_operational": 3, 00:13:05.035 "base_bdevs_list": [ 00:13:05.035 { 00:13:05.035 "name": "BaseBdev1", 00:13:05.035 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:05.035 "is_configured": true, 00:13:05.035 "data_offset": 2048, 00:13:05.035 "data_size": 63488 00:13:05.035 }, 00:13:05.035 { 00:13:05.035 "name": null, 00:13:05.035 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:05.035 "is_configured": false, 00:13:05.035 "data_offset": 2048, 00:13:05.035 "data_size": 63488 00:13:05.035 }, 00:13:05.035 { 00:13:05.035 "name": "BaseBdev3", 00:13:05.035 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 
00:13:05.035 "is_configured": true, 00:13:05.035 "data_offset": 2048, 00:13:05.035 "data_size": 63488 00:13:05.035 } 00:13:05.035 ] 00:13:05.035 }' 00:13:05.035 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.035 13:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:05.598 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.598 13:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:05.598 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:05.598 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:05.855 [2024-07-15 13:35:53.335474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.855 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:06.112 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.112 "name": "Existed_Raid", 00:13:06.112 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:06.112 "strip_size_kb": 64, 00:13:06.112 "state": "configuring", 00:13:06.112 "raid_level": "concat", 00:13:06.112 "superblock": true, 00:13:06.112 "num_base_bdevs": 3, 00:13:06.112 "num_base_bdevs_discovered": 1, 00:13:06.112 "num_base_bdevs_operational": 3, 00:13:06.112 "base_bdevs_list": [ 00:13:06.112 { 00:13:06.112 "name": null, 00:13:06.112 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:06.112 "is_configured": false, 00:13:06.112 "data_offset": 2048, 00:13:06.112 "data_size": 63488 00:13:06.112 }, 00:13:06.112 { 00:13:06.112 "name": null, 00:13:06.112 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:06.112 "is_configured": false, 00:13:06.112 "data_offset": 2048, 
00:13:06.112 "data_size": 63488 00:13:06.112 }, 00:13:06.112 { 00:13:06.112 "name": "BaseBdev3", 00:13:06.112 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:06.112 "is_configured": true, 00:13:06.112 "data_offset": 2048, 00:13:06.112 "data_size": 63488 00:13:06.112 } 00:13:06.112 ] 00:13:06.112 }' 00:13:06.112 13:35:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.112 13:35:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:06.676 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.676 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:06.676 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:06.676 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:06.934 [2024-07-15 13:35:54.344784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.934 "name": "Existed_Raid", 00:13:06.934 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:06.934 "strip_size_kb": 64, 00:13:06.934 "state": "configuring", 00:13:06.934 "raid_level": "concat", 00:13:06.934 "superblock": true, 00:13:06.934 "num_base_bdevs": 3, 00:13:06.934 "num_base_bdevs_discovered": 2, 00:13:06.934 "num_base_bdevs_operational": 3, 00:13:06.934 "base_bdevs_list": [ 00:13:06.934 { 00:13:06.934 "name": null, 00:13:06.934 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:06.934 "is_configured": false, 00:13:06.934 "data_offset": 2048, 00:13:06.934 
"data_size": 63488 00:13:06.934 }, 00:13:06.934 { 00:13:06.934 "name": "BaseBdev2", 00:13:06.934 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:06.934 "is_configured": true, 00:13:06.934 "data_offset": 2048, 00:13:06.934 "data_size": 63488 00:13:06.934 }, 00:13:06.934 { 00:13:06.934 "name": "BaseBdev3", 00:13:06.934 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:06.934 "is_configured": true, 00:13:06.934 "data_offset": 2048, 00:13:06.934 "data_size": 63488 00:13:06.934 } 00:13:06.934 ] 00:13:06.934 }' 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.934 13:35:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:07.498 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.498 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:07.755 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:07.755 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.755 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0ad66f05-5689-40a0-895d-2fe2dcd6ec0f 00:13:08.012 [2024-07-15 13:35:55.570778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:08.012 [2024-07-15 13:35:55.570898] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a8f70 00:13:08.012 [2024-07-15 13:35:55.570907] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:08.012 [2024-07-15 13:35:55.571042] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x805cc0 00:13:08.012 [2024-07-15 13:35:55.571121] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a8f70 00:13:08.012 [2024-07-15 13:35:55.571128] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9a8f70 00:13:08.012 [2024-07-15 13:35:55.571190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:08.012 NewBaseBdev 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:08.012 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.269 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:08.558 [ 00:13:08.558 { 00:13:08.558 "name": "NewBaseBdev", 00:13:08.558 "aliases": [ 00:13:08.558 "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f" 00:13:08.558 ], 00:13:08.558 "product_name": "Malloc disk", 00:13:08.558 "block_size": 512, 00:13:08.558 "num_blocks": 65536, 00:13:08.558 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:08.558 "assigned_rate_limits": { 00:13:08.558 "rw_ios_per_sec": 0, 00:13:08.558 "rw_mbytes_per_sec": 0, 00:13:08.558 "r_mbytes_per_sec": 0, 00:13:08.558 "w_mbytes_per_sec": 0 00:13:08.558 }, 00:13:08.558 "claimed": true, 00:13:08.558 "claim_type": "exclusive_write", 00:13:08.558 "zoned": false, 00:13:08.558 "supported_io_types": { 00:13:08.558 "read": true, 00:13:08.558 "write": true, 00:13:08.558 "unmap": true, 00:13:08.558 "flush": true, 00:13:08.558 "reset": true, 00:13:08.558 "nvme_admin": false, 00:13:08.558 "nvme_io": false, 00:13:08.558 "nvme_io_md": false, 00:13:08.558 "write_zeroes": true, 00:13:08.558 "zcopy": true, 00:13:08.558 "get_zone_info": false, 00:13:08.558 "zone_management": false, 00:13:08.558 "zone_append": false, 00:13:08.558 "compare": false, 00:13:08.558 "compare_and_write": false, 00:13:08.558 "abort": true, 00:13:08.558 "seek_hole": false, 00:13:08.558 "seek_data": false, 00:13:08.558 "copy": true, 00:13:08.558 "nvme_iov_md": false 00:13:08.558 }, 00:13:08.558 "memory_domains": [ 00:13:08.558 { 00:13:08.558 "dma_device_id": "system", 00:13:08.558 "dma_device_type": 1 00:13:08.558 }, 00:13:08.558 { 00:13:08.558 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.558 "dma_device_type": 2 00:13:08.558 } 00:13:08.558 ], 00:13:08.558 "driver_specific": {} 00:13:08.558 } 00:13:08.558 ] 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.558 13:35:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:13:08.558 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.558 "name": "Existed_Raid", 00:13:08.558 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:08.558 "strip_size_kb": 64, 00:13:08.558 "state": "online", 00:13:08.558 "raid_level": "concat", 00:13:08.558 "superblock": true, 00:13:08.558 "num_base_bdevs": 3, 00:13:08.558 "num_base_bdevs_discovered": 3, 00:13:08.558 "num_base_bdevs_operational": 3, 00:13:08.558 "base_bdevs_list": [ 00:13:08.558 { 00:13:08.558 "name": "NewBaseBdev", 00:13:08.558 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:08.558 "is_configured": true, 00:13:08.559 "data_offset": 2048, 00:13:08.559 "data_size": 63488 00:13:08.559 }, 00:13:08.559 { 00:13:08.559 "name": "BaseBdev2", 00:13:08.559 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:08.559 "is_configured": true, 00:13:08.559 "data_offset": 2048, 00:13:08.559 "data_size": 63488 00:13:08.559 }, 00:13:08.559 { 00:13:08.559 "name": "BaseBdev3", 00:13:08.559 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:08.559 "is_configured": true, 00:13:08.559 "data_offset": 2048, 00:13:08.559 "data_size": 63488 00:13:08.559 } 00:13:08.559 ] 00:13:08.559 }' 00:13:08.559 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.559 13:35:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.146 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:09.147 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:09.404 [2024-07-15 13:35:56.782125] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:09.404 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:09.404 "name": "Existed_Raid", 00:13:09.404 "aliases": [ 00:13:09.404 "e5e29722-60d9-447a-8f8d-f356c102b4f0" 00:13:09.404 ], 00:13:09.404 "product_name": "Raid Volume", 00:13:09.404 "block_size": 512, 00:13:09.404 "num_blocks": 190464, 00:13:09.404 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:09.404 "assigned_rate_limits": { 00:13:09.404 "rw_ios_per_sec": 0, 00:13:09.404 "rw_mbytes_per_sec": 0, 00:13:09.404 "r_mbytes_per_sec": 0, 00:13:09.405 "w_mbytes_per_sec": 0 00:13:09.405 }, 00:13:09.405 "claimed": false, 00:13:09.405 "zoned": false, 00:13:09.405 "supported_io_types": { 00:13:09.405 "read": true, 00:13:09.405 "write": true, 00:13:09.405 "unmap": true, 00:13:09.405 "flush": true, 00:13:09.405 "reset": true, 00:13:09.405 "nvme_admin": false, 00:13:09.405 "nvme_io": false, 00:13:09.405 "nvme_io_md": false, 00:13:09.405 "write_zeroes": true, 00:13:09.405 
"zcopy": false, 00:13:09.405 "get_zone_info": false, 00:13:09.405 "zone_management": false, 00:13:09.405 "zone_append": false, 00:13:09.405 "compare": false, 00:13:09.405 "compare_and_write": false, 00:13:09.405 "abort": false, 00:13:09.405 "seek_hole": false, 00:13:09.405 "seek_data": false, 00:13:09.405 "copy": false, 00:13:09.405 "nvme_iov_md": false 00:13:09.405 }, 00:13:09.405 "memory_domains": [ 00:13:09.405 { 00:13:09.405 "dma_device_id": "system", 00:13:09.405 "dma_device_type": 1 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.405 "dma_device_type": 2 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "dma_device_id": "system", 00:13:09.405 "dma_device_type": 1 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.405 "dma_device_type": 2 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "dma_device_id": "system", 00:13:09.405 "dma_device_type": 1 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.405 "dma_device_type": 2 00:13:09.405 } 00:13:09.405 ], 00:13:09.405 "driver_specific": { 00:13:09.405 "raid": { 00:13:09.405 "uuid": "e5e29722-60d9-447a-8f8d-f356c102b4f0", 00:13:09.405 "strip_size_kb": 64, 00:13:09.405 "state": "online", 00:13:09.405 "raid_level": "concat", 00:13:09.405 "superblock": true, 00:13:09.405 "num_base_bdevs": 3, 00:13:09.405 "num_base_bdevs_discovered": 3, 00:13:09.405 "num_base_bdevs_operational": 3, 00:13:09.405 "base_bdevs_list": [ 00:13:09.405 { 00:13:09.405 "name": "NewBaseBdev", 00:13:09.405 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:09.405 "is_configured": true, 00:13:09.405 "data_offset": 2048, 00:13:09.405 "data_size": 63488 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "name": "BaseBdev2", 00:13:09.405 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:09.405 "is_configured": true, 00:13:09.405 "data_offset": 2048, 00:13:09.405 "data_size": 63488 00:13:09.405 }, 00:13:09.405 { 00:13:09.405 "name": "BaseBdev3", 00:13:09.405 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:09.405 "is_configured": true, 00:13:09.405 "data_offset": 2048, 00:13:09.405 "data_size": 63488 00:13:09.405 } 00:13:09.405 ] 00:13:09.405 } 00:13:09.405 } 00:13:09.405 }' 00:13:09.405 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:09.405 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:09.405 BaseBdev2 00:13:09.405 BaseBdev3' 00:13:09.405 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:09.405 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:09.405 13:35:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:09.664 "name": "NewBaseBdev", 00:13:09.664 "aliases": [ 00:13:09.664 "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f" 00:13:09.664 ], 00:13:09.664 "product_name": "Malloc disk", 00:13:09.664 "block_size": 512, 00:13:09.664 "num_blocks": 65536, 00:13:09.664 "uuid": "0ad66f05-5689-40a0-895d-2fe2dcd6ec0f", 00:13:09.664 "assigned_rate_limits": { 00:13:09.664 "rw_ios_per_sec": 0, 00:13:09.664 "rw_mbytes_per_sec": 0, 
00:13:09.664 "r_mbytes_per_sec": 0, 00:13:09.664 "w_mbytes_per_sec": 0 00:13:09.664 }, 00:13:09.664 "claimed": true, 00:13:09.664 "claim_type": "exclusive_write", 00:13:09.664 "zoned": false, 00:13:09.664 "supported_io_types": { 00:13:09.664 "read": true, 00:13:09.664 "write": true, 00:13:09.664 "unmap": true, 00:13:09.664 "flush": true, 00:13:09.664 "reset": true, 00:13:09.664 "nvme_admin": false, 00:13:09.664 "nvme_io": false, 00:13:09.664 "nvme_io_md": false, 00:13:09.664 "write_zeroes": true, 00:13:09.664 "zcopy": true, 00:13:09.664 "get_zone_info": false, 00:13:09.664 "zone_management": false, 00:13:09.664 "zone_append": false, 00:13:09.664 "compare": false, 00:13:09.664 "compare_and_write": false, 00:13:09.664 "abort": true, 00:13:09.664 "seek_hole": false, 00:13:09.664 "seek_data": false, 00:13:09.664 "copy": true, 00:13:09.664 "nvme_iov_md": false 00:13:09.664 }, 00:13:09.664 "memory_domains": [ 00:13:09.664 { 00:13:09.664 "dma_device_id": "system", 00:13:09.664 "dma_device_type": 1 00:13:09.664 }, 00:13:09.664 { 00:13:09.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.664 "dma_device_type": 2 00:13:09.664 } 00:13:09.664 ], 00:13:09.664 "driver_specific": {} 00:13:09.664 }' 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:09.664 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:09.923 "name": "BaseBdev2", 00:13:09.923 "aliases": [ 00:13:09.923 "27b9491c-ddf0-4d13-9703-46887c5597f8" 00:13:09.923 ], 00:13:09.923 "product_name": "Malloc disk", 00:13:09.923 "block_size": 512, 00:13:09.923 "num_blocks": 65536, 00:13:09.923 "uuid": "27b9491c-ddf0-4d13-9703-46887c5597f8", 00:13:09.923 "assigned_rate_limits": { 00:13:09.923 "rw_ios_per_sec": 0, 00:13:09.923 "rw_mbytes_per_sec": 0, 00:13:09.923 "r_mbytes_per_sec": 0, 00:13:09.923 "w_mbytes_per_sec": 0 00:13:09.923 }, 00:13:09.923 "claimed": true, 00:13:09.923 
"claim_type": "exclusive_write", 00:13:09.923 "zoned": false, 00:13:09.923 "supported_io_types": { 00:13:09.923 "read": true, 00:13:09.923 "write": true, 00:13:09.923 "unmap": true, 00:13:09.923 "flush": true, 00:13:09.923 "reset": true, 00:13:09.923 "nvme_admin": false, 00:13:09.923 "nvme_io": false, 00:13:09.923 "nvme_io_md": false, 00:13:09.923 "write_zeroes": true, 00:13:09.923 "zcopy": true, 00:13:09.923 "get_zone_info": false, 00:13:09.923 "zone_management": false, 00:13:09.923 "zone_append": false, 00:13:09.923 "compare": false, 00:13:09.923 "compare_and_write": false, 00:13:09.923 "abort": true, 00:13:09.923 "seek_hole": false, 00:13:09.923 "seek_data": false, 00:13:09.923 "copy": true, 00:13:09.923 "nvme_iov_md": false 00:13:09.923 }, 00:13:09.923 "memory_domains": [ 00:13:09.923 { 00:13:09.923 "dma_device_id": "system", 00:13:09.923 "dma_device_type": 1 00:13:09.923 }, 00:13:09.923 { 00:13:09.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.923 "dma_device_type": 2 00:13:09.923 } 00:13:09.923 ], 00:13:09.923 "driver_specific": {} 00:13:09.923 }' 00:13:09.923 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:10.180 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:10.438 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.438 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:10.438 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:10.438 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:10.438 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:10.438 "name": "BaseBdev3", 00:13:10.438 "aliases": [ 00:13:10.438 "50f9a9c1-54aa-458a-91b6-8806b7fac31c" 00:13:10.438 ], 00:13:10.438 "product_name": "Malloc disk", 00:13:10.438 "block_size": 512, 00:13:10.438 "num_blocks": 65536, 00:13:10.438 "uuid": "50f9a9c1-54aa-458a-91b6-8806b7fac31c", 00:13:10.438 "assigned_rate_limits": { 00:13:10.438 "rw_ios_per_sec": 0, 00:13:10.438 "rw_mbytes_per_sec": 0, 00:13:10.438 "r_mbytes_per_sec": 0, 00:13:10.438 "w_mbytes_per_sec": 0 00:13:10.438 }, 00:13:10.438 "claimed": true, 00:13:10.438 "claim_type": "exclusive_write", 00:13:10.438 "zoned": false, 00:13:10.438 "supported_io_types": { 00:13:10.438 "read": true, 
00:13:10.438 "write": true, 00:13:10.438 "unmap": true, 00:13:10.438 "flush": true, 00:13:10.438 "reset": true, 00:13:10.438 "nvme_admin": false, 00:13:10.438 "nvme_io": false, 00:13:10.438 "nvme_io_md": false, 00:13:10.438 "write_zeroes": true, 00:13:10.438 "zcopy": true, 00:13:10.438 "get_zone_info": false, 00:13:10.438 "zone_management": false, 00:13:10.438 "zone_append": false, 00:13:10.438 "compare": false, 00:13:10.438 "compare_and_write": false, 00:13:10.438 "abort": true, 00:13:10.438 "seek_hole": false, 00:13:10.438 "seek_data": false, 00:13:10.438 "copy": true, 00:13:10.438 "nvme_iov_md": false 00:13:10.438 }, 00:13:10.438 "memory_domains": [ 00:13:10.438 { 00:13:10.438 "dma_device_id": "system", 00:13:10.438 "dma_device_type": 1 00:13:10.438 }, 00:13:10.438 { 00:13:10.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.438 "dma_device_type": 2 00:13:10.438 } 00:13:10.438 ], 00:13:10.438 "driver_specific": {} 00:13:10.438 }' 00:13:10.438 13:35:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.438 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.696 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:10.953 [2024-07-15 13:35:58.466413] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:10.953 [2024-07-15 13:35:58.466433] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:10.953 [2024-07-15 13:35:58.466471] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:10.953 [2024-07-15 13:35:58.466510] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:10.953 [2024-07-15 13:35:58.466518] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a8f70 name Existed_Raid, state offline 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2393 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2393 ']' 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2393 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:10.953 
13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2393 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2393' 00:13:10.953 killing process with pid 2393 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2393 00:13:10.953 [2024-07-15 13:35:58.533936] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:10.953 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2393 00:13:10.953 [2024-07-15 13:35:58.559025] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:11.211 13:35:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:11.211 00:13:11.211 real 0m21.882s 00:13:11.211 user 0m39.816s 00:13:11.211 sys 0m4.309s 00:13:11.211 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:11.211 13:35:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.211 ************************************ 00:13:11.211 END TEST raid_state_function_test_sb 00:13:11.211 ************************************ 00:13:11.211 13:35:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:11.211 13:35:58 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:13:11.211 13:35:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:11.211 13:35:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:11.211 13:35:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:11.211 ************************************ 00:13:11.211 START TEST raid_superblock_test 00:13:11.211 ************************************ 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:11.211 13:35:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=6061 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 6061 /var/tmp/spdk-raid.sock 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 6061 ']' 00:13:11.211 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:11.212 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:11.212 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:11.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:11.212 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:11.212 13:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:11.212 13:35:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.469 [2024-07-15 13:35:58.875275] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
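At this point the superblock test has started a standalone bdev_svc application listening on /var/tmp/spdk-raid.sock and drives it purely over RPC. Condensed into a minimal sketch (assumptions: the full /var/jenkins/workspace/crypto-phy-autotest/spdk prefix on bdev_svc and rpc.py is shortened here for readability, and the app is simply backgrounded, whereas the harness actually uses waitforlisten as captured above), the sequence the following log entries exercise is roughly:

  # start the bdev service with raid debug logging enabled, listening on the raid RPC socket
  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &

  # three 32 MiB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev pt1..pt3
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  # ...repeated for malloc2/pt2 and malloc3/pt3...

  # assemble a 3-member concat raid with a 64 KB strip size and an on-disk superblock (-s)
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s

  # confirm the array comes up online with all three base bdevs configured
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

Each malloc bdev reports 65536 blocks of 512 bytes (32 MiB), and the assembled concat volume reports 190464 blocks (3 x 63488 data blocks after the 2048-block superblock offset), matching the blockcnt in the raid_bdev_configure_cont debug entries below.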
00:13:11.469 [2024-07-15 13:35:58.875327] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid6061 ] 00:13:11.469 [2024-07-15 13:35:58.962074] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:11.469 [2024-07-15 13:35:59.052908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:11.726 [2024-07-15 13:35:59.112480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:11.726 [2024-07-15 13:35:59.112504] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:12.291 malloc1 00:13:12.291 13:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:12.549 [2024-07-15 13:36:00.006544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:12.549 [2024-07-15 13:36:00.006587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:12.549 [2024-07-15 13:36:00.006619] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3c260 00:13:12.549 [2024-07-15 13:36:00.006629] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:12.549 [2024-07-15 13:36:00.008062] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:12.549 [2024-07-15 13:36:00.008089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:12.549 pt1 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:12.549 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:12.807 malloc2 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:12.807 [2024-07-15 13:36:00.363377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:12.807 [2024-07-15 13:36:00.363414] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:12.807 [2024-07-15 13:36:00.363442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de6310 00:13:12.807 [2024-07-15 13:36:00.363450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:12.807 [2024-07-15 13:36:00.364588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:12.807 [2024-07-15 13:36:00.364611] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:12.807 pt2 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:12.807 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:13.064 malloc3 00:13:13.064 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:13.322 [2024-07-15 13:36:00.720128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:13.322 [2024-07-15 13:36:00.720162] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:13.322 [2024-07-15 13:36:00.720191] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de9e70 00:13:13.322 [2024-07-15 13:36:00.720200] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:13.322 [2024-07-15 13:36:00.721380] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:13.322 [2024-07-15 13:36:00.721402] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:13.322 pt3 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:13.322 [2024-07-15 13:36:00.896618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:13.322 [2024-07-15 13:36:00.897605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:13.322 [2024-07-15 13:36:00.897646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:13.322 [2024-07-15 13:36:00.897751] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1deae80 00:13:13.322 [2024-07-15 13:36:00.897759] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:13.322 [2024-07-15 13:36:00.897904] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1de5490 00:13:13.322 [2024-07-15 13:36:00.898012] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1deae80 00:13:13.322 [2024-07-15 13:36:00.898020] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1deae80 00:13:13.322 [2024-07-15 13:36:00.898090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.322 13:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:13.580 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.580 "name": "raid_bdev1", 00:13:13.580 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:13.580 "strip_size_kb": 64, 00:13:13.580 "state": "online", 00:13:13.580 "raid_level": "concat", 00:13:13.580 "superblock": true, 00:13:13.580 "num_base_bdevs": 3, 
00:13:13.580 "num_base_bdevs_discovered": 3, 00:13:13.580 "num_base_bdevs_operational": 3, 00:13:13.580 "base_bdevs_list": [ 00:13:13.580 { 00:13:13.580 "name": "pt1", 00:13:13.580 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:13.580 "is_configured": true, 00:13:13.580 "data_offset": 2048, 00:13:13.580 "data_size": 63488 00:13:13.580 }, 00:13:13.580 { 00:13:13.580 "name": "pt2", 00:13:13.580 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:13.580 "is_configured": true, 00:13:13.580 "data_offset": 2048, 00:13:13.580 "data_size": 63488 00:13:13.580 }, 00:13:13.580 { 00:13:13.580 "name": "pt3", 00:13:13.580 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:13.580 "is_configured": true, 00:13:13.580 "data_offset": 2048, 00:13:13.580 "data_size": 63488 00:13:13.580 } 00:13:13.580 ] 00:13:13.580 }' 00:13:13.580 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.580 13:36:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:14.145 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.145 [2024-07-15 13:36:01.763111] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.403 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:14.403 "name": "raid_bdev1", 00:13:14.403 "aliases": [ 00:13:14.403 "62d285e1-5e24-4ad4-a0ce-be93d9977b35" 00:13:14.403 ], 00:13:14.403 "product_name": "Raid Volume", 00:13:14.403 "block_size": 512, 00:13:14.403 "num_blocks": 190464, 00:13:14.403 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:14.403 "assigned_rate_limits": { 00:13:14.403 "rw_ios_per_sec": 0, 00:13:14.403 "rw_mbytes_per_sec": 0, 00:13:14.403 "r_mbytes_per_sec": 0, 00:13:14.403 "w_mbytes_per_sec": 0 00:13:14.403 }, 00:13:14.403 "claimed": false, 00:13:14.403 "zoned": false, 00:13:14.403 "supported_io_types": { 00:13:14.403 "read": true, 00:13:14.403 "write": true, 00:13:14.403 "unmap": true, 00:13:14.403 "flush": true, 00:13:14.403 "reset": true, 00:13:14.403 "nvme_admin": false, 00:13:14.403 "nvme_io": false, 00:13:14.403 "nvme_io_md": false, 00:13:14.403 "write_zeroes": true, 00:13:14.403 "zcopy": false, 00:13:14.403 "get_zone_info": false, 00:13:14.403 "zone_management": false, 00:13:14.403 "zone_append": false, 00:13:14.403 "compare": false, 00:13:14.403 "compare_and_write": false, 00:13:14.403 "abort": false, 00:13:14.403 "seek_hole": false, 00:13:14.403 "seek_data": false, 00:13:14.403 "copy": false, 00:13:14.403 "nvme_iov_md": false 00:13:14.403 }, 00:13:14.403 "memory_domains": [ 00:13:14.403 { 00:13:14.403 "dma_device_id": "system", 00:13:14.403 "dma_device_type": 1 
00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.403 "dma_device_type": 2 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "dma_device_id": "system", 00:13:14.403 "dma_device_type": 1 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.403 "dma_device_type": 2 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "dma_device_id": "system", 00:13:14.403 "dma_device_type": 1 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.403 "dma_device_type": 2 00:13:14.403 } 00:13:14.403 ], 00:13:14.403 "driver_specific": { 00:13:14.403 "raid": { 00:13:14.403 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:14.403 "strip_size_kb": 64, 00:13:14.403 "state": "online", 00:13:14.403 "raid_level": "concat", 00:13:14.403 "superblock": true, 00:13:14.403 "num_base_bdevs": 3, 00:13:14.403 "num_base_bdevs_discovered": 3, 00:13:14.403 "num_base_bdevs_operational": 3, 00:13:14.403 "base_bdevs_list": [ 00:13:14.403 { 00:13:14.403 "name": "pt1", 00:13:14.403 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:14.403 "is_configured": true, 00:13:14.403 "data_offset": 2048, 00:13:14.403 "data_size": 63488 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "name": "pt2", 00:13:14.403 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:14.403 "is_configured": true, 00:13:14.403 "data_offset": 2048, 00:13:14.403 "data_size": 63488 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "name": "pt3", 00:13:14.403 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:14.403 "is_configured": true, 00:13:14.403 "data_offset": 2048, 00:13:14.403 "data_size": 63488 00:13:14.403 } 00:13:14.403 ] 00:13:14.403 } 00:13:14.403 } 00:13:14.403 }' 00:13:14.403 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:14.403 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:14.403 pt2 00:13:14.403 pt3' 00:13:14.403 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.403 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:14.403 13:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.403 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.403 "name": "pt1", 00:13:14.403 "aliases": [ 00:13:14.403 "00000000-0000-0000-0000-000000000001" 00:13:14.403 ], 00:13:14.403 "product_name": "passthru", 00:13:14.403 "block_size": 512, 00:13:14.403 "num_blocks": 65536, 00:13:14.403 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:14.403 "assigned_rate_limits": { 00:13:14.403 "rw_ios_per_sec": 0, 00:13:14.403 "rw_mbytes_per_sec": 0, 00:13:14.403 "r_mbytes_per_sec": 0, 00:13:14.403 "w_mbytes_per_sec": 0 00:13:14.403 }, 00:13:14.403 "claimed": true, 00:13:14.403 "claim_type": "exclusive_write", 00:13:14.403 "zoned": false, 00:13:14.403 "supported_io_types": { 00:13:14.403 "read": true, 00:13:14.403 "write": true, 00:13:14.403 "unmap": true, 00:13:14.403 "flush": true, 00:13:14.403 "reset": true, 00:13:14.403 "nvme_admin": false, 00:13:14.403 "nvme_io": false, 00:13:14.403 "nvme_io_md": false, 00:13:14.403 "write_zeroes": true, 00:13:14.403 "zcopy": true, 00:13:14.403 "get_zone_info": false, 00:13:14.403 "zone_management": false, 
00:13:14.403 "zone_append": false, 00:13:14.403 "compare": false, 00:13:14.403 "compare_and_write": false, 00:13:14.403 "abort": true, 00:13:14.403 "seek_hole": false, 00:13:14.403 "seek_data": false, 00:13:14.403 "copy": true, 00:13:14.403 "nvme_iov_md": false 00:13:14.403 }, 00:13:14.403 "memory_domains": [ 00:13:14.403 { 00:13:14.403 "dma_device_id": "system", 00:13:14.403 "dma_device_type": 1 00:13:14.403 }, 00:13:14.403 { 00:13:14.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.403 "dma_device_type": 2 00:13:14.403 } 00:13:14.403 ], 00:13:14.403 "driver_specific": { 00:13:14.403 "passthru": { 00:13:14.403 "name": "pt1", 00:13:14.403 "base_bdev_name": "malloc1" 00:13:14.403 } 00:13:14.403 } 00:13:14.403 }' 00:13:14.403 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.660 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.918 "name": "pt2", 00:13:14.918 "aliases": [ 00:13:14.918 "00000000-0000-0000-0000-000000000002" 00:13:14.918 ], 00:13:14.918 "product_name": "passthru", 00:13:14.918 "block_size": 512, 00:13:14.918 "num_blocks": 65536, 00:13:14.918 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:14.918 "assigned_rate_limits": { 00:13:14.918 "rw_ios_per_sec": 0, 00:13:14.918 "rw_mbytes_per_sec": 0, 00:13:14.918 "r_mbytes_per_sec": 0, 00:13:14.918 "w_mbytes_per_sec": 0 00:13:14.918 }, 00:13:14.918 "claimed": true, 00:13:14.918 "claim_type": "exclusive_write", 00:13:14.918 "zoned": false, 00:13:14.918 "supported_io_types": { 00:13:14.918 "read": true, 00:13:14.918 "write": true, 00:13:14.918 "unmap": true, 00:13:14.918 "flush": true, 00:13:14.918 "reset": true, 00:13:14.918 "nvme_admin": false, 00:13:14.918 "nvme_io": false, 00:13:14.918 "nvme_io_md": false, 00:13:14.918 "write_zeroes": true, 00:13:14.918 "zcopy": true, 00:13:14.918 "get_zone_info": false, 00:13:14.918 "zone_management": false, 00:13:14.918 "zone_append": false, 00:13:14.918 "compare": false, 00:13:14.918 "compare_and_write": false, 00:13:14.918 "abort": true, 
00:13:14.918 "seek_hole": false, 00:13:14.918 "seek_data": false, 00:13:14.918 "copy": true, 00:13:14.918 "nvme_iov_md": false 00:13:14.918 }, 00:13:14.918 "memory_domains": [ 00:13:14.918 { 00:13:14.918 "dma_device_id": "system", 00:13:14.918 "dma_device_type": 1 00:13:14.918 }, 00:13:14.918 { 00:13:14.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.918 "dma_device_type": 2 00:13:14.918 } 00:13:14.918 ], 00:13:14.918 "driver_specific": { 00:13:14.918 "passthru": { 00:13:14.918 "name": "pt2", 00:13:14.918 "base_bdev_name": "malloc2" 00:13:14.918 } 00:13:14.918 } 00:13:14.918 }' 00:13:14.918 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.176 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.433 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.433 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.433 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:15.433 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.433 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.433 "name": "pt3", 00:13:15.433 "aliases": [ 00:13:15.433 "00000000-0000-0000-0000-000000000003" 00:13:15.433 ], 00:13:15.433 "product_name": "passthru", 00:13:15.433 "block_size": 512, 00:13:15.433 "num_blocks": 65536, 00:13:15.433 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:15.433 "assigned_rate_limits": { 00:13:15.433 "rw_ios_per_sec": 0, 00:13:15.433 "rw_mbytes_per_sec": 0, 00:13:15.433 "r_mbytes_per_sec": 0, 00:13:15.433 "w_mbytes_per_sec": 0 00:13:15.433 }, 00:13:15.433 "claimed": true, 00:13:15.433 "claim_type": "exclusive_write", 00:13:15.433 "zoned": false, 00:13:15.433 "supported_io_types": { 00:13:15.433 "read": true, 00:13:15.433 "write": true, 00:13:15.433 "unmap": true, 00:13:15.433 "flush": true, 00:13:15.433 "reset": true, 00:13:15.433 "nvme_admin": false, 00:13:15.433 "nvme_io": false, 00:13:15.433 "nvme_io_md": false, 00:13:15.433 "write_zeroes": true, 00:13:15.433 "zcopy": true, 00:13:15.434 "get_zone_info": false, 00:13:15.434 "zone_management": false, 00:13:15.434 "zone_append": false, 00:13:15.434 "compare": false, 00:13:15.434 "compare_and_write": false, 00:13:15.434 "abort": true, 00:13:15.434 "seek_hole": false, 00:13:15.434 "seek_data": false, 00:13:15.434 "copy": true, 00:13:15.434 "nvme_iov_md": false 
00:13:15.434 }, 00:13:15.434 "memory_domains": [ 00:13:15.434 { 00:13:15.434 "dma_device_id": "system", 00:13:15.434 "dma_device_type": 1 00:13:15.434 }, 00:13:15.434 { 00:13:15.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.434 "dma_device_type": 2 00:13:15.434 } 00:13:15.434 ], 00:13:15.434 "driver_specific": { 00:13:15.434 "passthru": { 00:13:15.434 "name": "pt3", 00:13:15.434 "base_bdev_name": "malloc3" 00:13:15.434 } 00:13:15.434 } 00:13:15.434 }' 00:13:15.434 13:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.434 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.693 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.693 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.693 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.693 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:15.694 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:15.950 [2024-07-15 13:36:03.439480] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:15.950 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=62d285e1-5e24-4ad4-a0ce-be93d9977b35 00:13:15.950 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 62d285e1-5e24-4ad4-a0ce-be93d9977b35 ']' 00:13:15.950 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:16.208 [2024-07-15 13:36:03.615775] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:16.208 [2024-07-15 13:36:03.615791] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:16.208 [2024-07-15 13:36:03.615826] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:16.208 [2024-07-15 13:36:03.615864] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:16.208 [2024-07-15 13:36:03.615872] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1deae80 name raid_bdev1, state offline 00:13:16.208 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.208 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:16.208 13:36:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:16.208 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:16.208 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:16.208 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:16.467 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:16.467 13:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:16.726 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:16.726 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:16.726 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:16.726 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:16.984 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:17.243 [2024-07-15 13:36:04.654444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:17.243 [2024-07-15 13:36:04.655461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:17.243 [2024-07-15 13:36:04.655492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:17.243 [2024-07-15 13:36:04.655524] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:17.243 [2024-07-15 13:36:04.655552] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:17.243 [2024-07-15 13:36:04.655583] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:17.243 [2024-07-15 13:36:04.655596] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:17.243 [2024-07-15 13:36:04.655608] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1de6540 name raid_bdev1, state configuring 00:13:17.243 request: 00:13:17.243 { 00:13:17.243 "name": "raid_bdev1", 00:13:17.243 "raid_level": "concat", 00:13:17.243 "base_bdevs": [ 00:13:17.243 "malloc1", 00:13:17.243 "malloc2", 00:13:17.243 "malloc3" 00:13:17.243 ], 00:13:17.243 "strip_size_kb": 64, 00:13:17.243 "superblock": false, 00:13:17.243 "method": "bdev_raid_create", 00:13:17.243 "req_id": 1 00:13:17.243 } 00:13:17.243 Got JSON-RPC error response 00:13:17.243 response: 00:13:17.243 { 00:13:17.244 "code": -17, 00:13:17.244 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:17.244 } 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:17.244 13:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:17.503 [2024-07-15 13:36:04.999303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:17.503 [2024-07-15 13:36:04.999341] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:17.503 [2024-07-15 13:36:04.999356] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3ce80 00:13:17.503 [2024-07-15 13:36:04.999365] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:17.503 [2024-07-15 13:36:05.000637] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:17.503 [2024-07-15 13:36:05.000660] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:17.503 [2024-07-15 13:36:05.000713] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:17.503 [2024-07-15 13:36:05.000735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:17.503 pt1 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.503 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:17.762 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.762 "name": "raid_bdev1", 00:13:17.762 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:17.762 "strip_size_kb": 64, 00:13:17.762 "state": "configuring", 00:13:17.762 "raid_level": "concat", 00:13:17.762 "superblock": true, 00:13:17.762 "num_base_bdevs": 3, 00:13:17.762 "num_base_bdevs_discovered": 1, 00:13:17.762 "num_base_bdevs_operational": 3, 00:13:17.762 "base_bdevs_list": [ 00:13:17.762 { 00:13:17.762 "name": "pt1", 00:13:17.762 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:17.762 "is_configured": true, 00:13:17.762 "data_offset": 2048, 00:13:17.762 "data_size": 63488 00:13:17.762 }, 00:13:17.762 { 00:13:17.762 "name": null, 00:13:17.762 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:17.762 "is_configured": false, 00:13:17.762 "data_offset": 2048, 00:13:17.762 "data_size": 63488 00:13:17.762 }, 00:13:17.762 { 00:13:17.762 "name": null, 00:13:17.762 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:17.762 "is_configured": false, 00:13:17.762 "data_offset": 2048, 00:13:17.762 "data_size": 63488 00:13:17.762 } 00:13:17.762 ] 00:13:17.762 }' 00:13:17.762 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.762 13:36:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.330 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:18.330 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:18.330 [2024-07-15 
13:36:05.877574] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:18.330 [2024-07-15 13:36:05.877613] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:18.330 [2024-07-15 13:36:05.877644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3c700 00:13:18.330 [2024-07-15 13:36:05.877653] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:18.330 [2024-07-15 13:36:05.877898] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:18.330 [2024-07-15 13:36:05.877910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:18.330 [2024-07-15 13:36:05.877958] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:18.330 [2024-07-15 13:36:05.877971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:18.330 pt2 00:13:18.330 13:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:18.588 [2024-07-15 13:36:06.058104] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.588 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:18.846 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.846 "name": "raid_bdev1", 00:13:18.846 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:18.846 "strip_size_kb": 64, 00:13:18.846 "state": "configuring", 00:13:18.846 "raid_level": "concat", 00:13:18.846 "superblock": true, 00:13:18.846 "num_base_bdevs": 3, 00:13:18.846 "num_base_bdevs_discovered": 1, 00:13:18.846 "num_base_bdevs_operational": 3, 00:13:18.846 "base_bdevs_list": [ 00:13:18.846 { 00:13:18.846 "name": "pt1", 00:13:18.846 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:18.846 "is_configured": true, 00:13:18.846 "data_offset": 2048, 00:13:18.846 "data_size": 63488 00:13:18.846 }, 00:13:18.846 { 00:13:18.846 "name": null, 00:13:18.846 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:18.846 "is_configured": false, 
00:13:18.846 "data_offset": 2048, 00:13:18.846 "data_size": 63488 00:13:18.846 }, 00:13:18.846 { 00:13:18.846 "name": null, 00:13:18.846 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:18.846 "is_configured": false, 00:13:18.846 "data_offset": 2048, 00:13:18.846 "data_size": 63488 00:13:18.846 } 00:13:18.846 ] 00:13:18.846 }' 00:13:18.846 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.846 13:36:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.411 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:19.411 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:19.411 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:19.411 [2024-07-15 13:36:06.904241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:19.411 [2024-07-15 13:36:06.904283] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.411 [2024-07-15 13:36:06.904297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de7720 00:13:19.411 [2024-07-15 13:36:06.904305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.411 [2024-07-15 13:36:06.904541] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.411 [2024-07-15 13:36:06.904552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:19.411 [2024-07-15 13:36:06.904600] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:19.411 [2024-07-15 13:36:06.904613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:19.411 pt2 00:13:19.411 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:19.411 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:19.411 13:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:19.669 [2024-07-15 13:36:07.076679] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:19.669 [2024-07-15 13:36:07.076703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.669 [2024-07-15 13:36:07.076714] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de8b10 00:13:19.669 [2024-07-15 13:36:07.076738] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.669 [2024-07-15 13:36:07.076937] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.669 [2024-07-15 13:36:07.076949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:19.669 [2024-07-15 13:36:07.076985] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:19.669 [2024-07-15 13:36:07.077007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:19.669 [2024-07-15 13:36:07.077080] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1de9410 00:13:19.669 [2024-07-15 13:36:07.077087] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:19.669 [2024-07-15 13:36:07.077211] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dea530 00:13:19.669 [2024-07-15 13:36:07.077291] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1de9410 00:13:19.669 [2024-07-15 13:36:07.077297] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1de9410 00:13:19.669 [2024-07-15 13:36:07.077359] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:19.669 pt3 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.669 "name": "raid_bdev1", 00:13:19.669 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:19.669 "strip_size_kb": 64, 00:13:19.669 "state": "online", 00:13:19.669 "raid_level": "concat", 00:13:19.669 "superblock": true, 00:13:19.669 "num_base_bdevs": 3, 00:13:19.669 "num_base_bdevs_discovered": 3, 00:13:19.669 "num_base_bdevs_operational": 3, 00:13:19.669 "base_bdevs_list": [ 00:13:19.669 { 00:13:19.669 "name": "pt1", 00:13:19.669 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:19.669 "is_configured": true, 00:13:19.669 "data_offset": 2048, 00:13:19.669 "data_size": 63488 00:13:19.669 }, 00:13:19.669 { 00:13:19.669 "name": "pt2", 00:13:19.669 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:19.669 "is_configured": true, 00:13:19.669 "data_offset": 2048, 00:13:19.669 "data_size": 63488 00:13:19.669 }, 00:13:19.669 { 00:13:19.669 "name": "pt3", 00:13:19.669 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:19.669 "is_configured": true, 00:13:19.669 "data_offset": 2048, 00:13:19.669 "data_size": 63488 00:13:19.669 } 00:13:19.669 ] 00:13:19.669 }' 00:13:19.669 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.669 13:36:07 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:20.236 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:20.494 [2024-07-15 13:36:07.943106] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:20.494 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:20.494 "name": "raid_bdev1", 00:13:20.494 "aliases": [ 00:13:20.494 "62d285e1-5e24-4ad4-a0ce-be93d9977b35" 00:13:20.494 ], 00:13:20.494 "product_name": "Raid Volume", 00:13:20.494 "block_size": 512, 00:13:20.494 "num_blocks": 190464, 00:13:20.494 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:20.494 "assigned_rate_limits": { 00:13:20.494 "rw_ios_per_sec": 0, 00:13:20.494 "rw_mbytes_per_sec": 0, 00:13:20.494 "r_mbytes_per_sec": 0, 00:13:20.494 "w_mbytes_per_sec": 0 00:13:20.494 }, 00:13:20.494 "claimed": false, 00:13:20.494 "zoned": false, 00:13:20.494 "supported_io_types": { 00:13:20.494 "read": true, 00:13:20.494 "write": true, 00:13:20.494 "unmap": true, 00:13:20.494 "flush": true, 00:13:20.494 "reset": true, 00:13:20.494 "nvme_admin": false, 00:13:20.494 "nvme_io": false, 00:13:20.494 "nvme_io_md": false, 00:13:20.494 "write_zeroes": true, 00:13:20.494 "zcopy": false, 00:13:20.494 "get_zone_info": false, 00:13:20.494 "zone_management": false, 00:13:20.494 "zone_append": false, 00:13:20.494 "compare": false, 00:13:20.494 "compare_and_write": false, 00:13:20.494 "abort": false, 00:13:20.494 "seek_hole": false, 00:13:20.494 "seek_data": false, 00:13:20.494 "copy": false, 00:13:20.494 "nvme_iov_md": false 00:13:20.494 }, 00:13:20.494 "memory_domains": [ 00:13:20.494 { 00:13:20.494 "dma_device_id": "system", 00:13:20.494 "dma_device_type": 1 00:13:20.494 }, 00:13:20.494 { 00:13:20.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.494 "dma_device_type": 2 00:13:20.494 }, 00:13:20.494 { 00:13:20.494 "dma_device_id": "system", 00:13:20.494 "dma_device_type": 1 00:13:20.494 }, 00:13:20.494 { 00:13:20.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.494 "dma_device_type": 2 00:13:20.494 }, 00:13:20.494 { 00:13:20.494 "dma_device_id": "system", 00:13:20.494 "dma_device_type": 1 00:13:20.494 }, 00:13:20.494 { 00:13:20.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.494 "dma_device_type": 2 00:13:20.494 } 00:13:20.494 ], 00:13:20.494 "driver_specific": { 00:13:20.494 "raid": { 00:13:20.494 "uuid": "62d285e1-5e24-4ad4-a0ce-be93d9977b35", 00:13:20.494 "strip_size_kb": 64, 00:13:20.494 "state": "online", 00:13:20.494 "raid_level": "concat", 00:13:20.494 "superblock": true, 00:13:20.494 "num_base_bdevs": 3, 00:13:20.494 "num_base_bdevs_discovered": 3, 
00:13:20.494 "num_base_bdevs_operational": 3, 00:13:20.494 "base_bdevs_list": [ 00:13:20.494 { 00:13:20.494 "name": "pt1", 00:13:20.494 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.494 "is_configured": true, 00:13:20.494 "data_offset": 2048, 00:13:20.494 "data_size": 63488 00:13:20.494 }, 00:13:20.494 { 00:13:20.494 "name": "pt2", 00:13:20.494 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.495 "is_configured": true, 00:13:20.495 "data_offset": 2048, 00:13:20.495 "data_size": 63488 00:13:20.495 }, 00:13:20.495 { 00:13:20.495 "name": "pt3", 00:13:20.495 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:20.495 "is_configured": true, 00:13:20.495 "data_offset": 2048, 00:13:20.495 "data_size": 63488 00:13:20.495 } 00:13:20.495 ] 00:13:20.495 } 00:13:20.495 } 00:13:20.495 }' 00:13:20.495 13:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:20.495 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:20.495 pt2 00:13:20.495 pt3' 00:13:20.495 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:20.495 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:20.495 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:20.753 "name": "pt1", 00:13:20.753 "aliases": [ 00:13:20.753 "00000000-0000-0000-0000-000000000001" 00:13:20.753 ], 00:13:20.753 "product_name": "passthru", 00:13:20.753 "block_size": 512, 00:13:20.753 "num_blocks": 65536, 00:13:20.753 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.753 "assigned_rate_limits": { 00:13:20.753 "rw_ios_per_sec": 0, 00:13:20.753 "rw_mbytes_per_sec": 0, 00:13:20.753 "r_mbytes_per_sec": 0, 00:13:20.753 "w_mbytes_per_sec": 0 00:13:20.753 }, 00:13:20.753 "claimed": true, 00:13:20.753 "claim_type": "exclusive_write", 00:13:20.753 "zoned": false, 00:13:20.753 "supported_io_types": { 00:13:20.753 "read": true, 00:13:20.753 "write": true, 00:13:20.753 "unmap": true, 00:13:20.753 "flush": true, 00:13:20.753 "reset": true, 00:13:20.753 "nvme_admin": false, 00:13:20.753 "nvme_io": false, 00:13:20.753 "nvme_io_md": false, 00:13:20.753 "write_zeroes": true, 00:13:20.753 "zcopy": true, 00:13:20.753 "get_zone_info": false, 00:13:20.753 "zone_management": false, 00:13:20.753 "zone_append": false, 00:13:20.753 "compare": false, 00:13:20.753 "compare_and_write": false, 00:13:20.753 "abort": true, 00:13:20.753 "seek_hole": false, 00:13:20.753 "seek_data": false, 00:13:20.753 "copy": true, 00:13:20.753 "nvme_iov_md": false 00:13:20.753 }, 00:13:20.753 "memory_domains": [ 00:13:20.753 { 00:13:20.753 "dma_device_id": "system", 00:13:20.753 "dma_device_type": 1 00:13:20.753 }, 00:13:20.753 { 00:13:20.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.753 "dma_device_type": 2 00:13:20.753 } 00:13:20.753 ], 00:13:20.753 "driver_specific": { 00:13:20.753 "passthru": { 00:13:20.753 "name": "pt1", 00:13:20.753 "base_bdev_name": "malloc1" 00:13:20.753 } 00:13:20.753 } 00:13:20.753 }' 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:20.753 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:21.012 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:21.270 "name": "pt2", 00:13:21.270 "aliases": [ 00:13:21.270 "00000000-0000-0000-0000-000000000002" 00:13:21.270 ], 00:13:21.270 "product_name": "passthru", 00:13:21.270 "block_size": 512, 00:13:21.270 "num_blocks": 65536, 00:13:21.270 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:21.270 "assigned_rate_limits": { 00:13:21.270 "rw_ios_per_sec": 0, 00:13:21.270 "rw_mbytes_per_sec": 0, 00:13:21.270 "r_mbytes_per_sec": 0, 00:13:21.270 "w_mbytes_per_sec": 0 00:13:21.270 }, 00:13:21.270 "claimed": true, 00:13:21.270 "claim_type": "exclusive_write", 00:13:21.270 "zoned": false, 00:13:21.270 "supported_io_types": { 00:13:21.270 "read": true, 00:13:21.270 "write": true, 00:13:21.270 "unmap": true, 00:13:21.270 "flush": true, 00:13:21.270 "reset": true, 00:13:21.270 "nvme_admin": false, 00:13:21.270 "nvme_io": false, 00:13:21.270 "nvme_io_md": false, 00:13:21.270 "write_zeroes": true, 00:13:21.270 "zcopy": true, 00:13:21.270 "get_zone_info": false, 00:13:21.270 "zone_management": false, 00:13:21.270 "zone_append": false, 00:13:21.270 "compare": false, 00:13:21.270 "compare_and_write": false, 00:13:21.270 "abort": true, 00:13:21.270 "seek_hole": false, 00:13:21.270 "seek_data": false, 00:13:21.270 "copy": true, 00:13:21.270 "nvme_iov_md": false 00:13:21.270 }, 00:13:21.270 "memory_domains": [ 00:13:21.270 { 00:13:21.270 "dma_device_id": "system", 00:13:21.270 "dma_device_type": 1 00:13:21.270 }, 00:13:21.270 { 00:13:21.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.270 "dma_device_type": 2 00:13:21.270 } 00:13:21.270 ], 00:13:21.270 "driver_specific": { 00:13:21.270 "passthru": { 00:13:21.270 "name": "pt2", 00:13:21.270 "base_bdev_name": "malloc2" 00:13:21.270 } 00:13:21.270 } 00:13:21.270 }' 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:21.270 13:36:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.270 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.528 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.528 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.528 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.528 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.528 13:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:21.528 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:21.528 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:21.786 "name": "pt3", 00:13:21.786 "aliases": [ 00:13:21.786 "00000000-0000-0000-0000-000000000003" 00:13:21.786 ], 00:13:21.786 "product_name": "passthru", 00:13:21.786 "block_size": 512, 00:13:21.786 "num_blocks": 65536, 00:13:21.786 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:21.786 "assigned_rate_limits": { 00:13:21.786 "rw_ios_per_sec": 0, 00:13:21.786 "rw_mbytes_per_sec": 0, 00:13:21.786 "r_mbytes_per_sec": 0, 00:13:21.786 "w_mbytes_per_sec": 0 00:13:21.786 }, 00:13:21.786 "claimed": true, 00:13:21.786 "claim_type": "exclusive_write", 00:13:21.786 "zoned": false, 00:13:21.786 "supported_io_types": { 00:13:21.786 "read": true, 00:13:21.786 "write": true, 00:13:21.786 "unmap": true, 00:13:21.786 "flush": true, 00:13:21.786 "reset": true, 00:13:21.786 "nvme_admin": false, 00:13:21.786 "nvme_io": false, 00:13:21.786 "nvme_io_md": false, 00:13:21.786 "write_zeroes": true, 00:13:21.786 "zcopy": true, 00:13:21.786 "get_zone_info": false, 00:13:21.786 "zone_management": false, 00:13:21.786 "zone_append": false, 00:13:21.786 "compare": false, 00:13:21.786 "compare_and_write": false, 00:13:21.786 "abort": true, 00:13:21.786 "seek_hole": false, 00:13:21.786 "seek_data": false, 00:13:21.786 "copy": true, 00:13:21.786 "nvme_iov_md": false 00:13:21.786 }, 00:13:21.786 "memory_domains": [ 00:13:21.786 { 00:13:21.786 "dma_device_id": "system", 00:13:21.786 "dma_device_type": 1 00:13:21.786 }, 00:13:21.786 { 00:13:21.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.786 "dma_device_type": 2 00:13:21.786 } 00:13:21.786 ], 00:13:21.786 "driver_specific": { 00:13:21.786 "passthru": { 00:13:21.786 "name": "pt3", 00:13:21.786 "base_bdev_name": "malloc3" 00:13:21.786 } 00:13:21.786 } 00:13:21.786 }' 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.786 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:22.045 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:22.045 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:22.045 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:22.045 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:22.045 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:22.045 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:22.045 [2024-07-15 13:36:09.647494] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 62d285e1-5e24-4ad4-a0ce-be93d9977b35 '!=' 62d285e1-5e24-4ad4-a0ce-be93d9977b35 ']' 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 6061 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 6061 ']' 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 6061 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 6061 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 6061' 00:13:22.302 killing process with pid 6061 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 6061 00:13:22.302 [2024-07-15 13:36:09.713802] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:22.302 [2024-07-15 13:36:09.713844] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:22.302 [2024-07-15 13:36:09.713882] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:22.302 [2024-07-15 13:36:09.713890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1de9410 name raid_bdev1, state offline 00:13:22.302 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 6061 00:13:22.302 [2024-07-15 13:36:09.742431] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:22.560 13:36:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@564 -- # return 0 00:13:22.560 00:13:22.560 real 0m11.118s 00:13:22.560 user 0m19.783s 00:13:22.560 sys 0m2.204s 00:13:22.560 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:22.560 13:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.560 ************************************ 00:13:22.560 END TEST raid_superblock_test 00:13:22.560 ************************************ 00:13:22.560 13:36:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:22.560 13:36:09 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:13:22.560 13:36:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:22.560 13:36:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:22.560 13:36:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:22.560 ************************************ 00:13:22.560 START TEST raid_read_error_test 00:13:22.560 ************************************ 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.gZuzdEvTAN 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=8275 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 8275 /var/tmp/spdk-raid.sock 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 8275 ']' 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:22.560 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:22.560 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.560 [2024-07-15 13:36:10.102187] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
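The trace above shows raid_read_error_test bringing up its harness: bdevperf is started against a dedicated RPC socket in wait mode, and its output feeds the mktemp'd log (here /raidtest/tmp.gZuzdEvTAN) that the test parses for a failure rate at the end. A minimal sketch of the equivalent manual launch, using the exact flags from the trace; the output redirection and pid capture are illustrative assumptions about how the script wires things together:

# start bdevperf in wait-for-RPC mode (-z), logging raid events (-L bdev_raid)
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
    -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
    -o 128k -q 1 -z -f -L bdev_raid > /raidtest/tmp.gZuzdEvTAN 2>&1 &
raid_pid=$!    # pid capture shown for illustration; the trace reports raid_pid=8275
# the script then blocks until /var/tmp/spdk-raid.sock accepts RPCs (waitforlisten)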
00:13:22.560 [2024-07-15 13:36:10.102240] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid8275 ] 00:13:22.818 [2024-07-15 13:36:10.190029] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:22.818 [2024-07-15 13:36:10.271411] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:22.818 [2024-07-15 13:36:10.325021] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:22.818 [2024-07-15 13:36:10.325052] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:23.383 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:23.383 13:36:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:23.383 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:23.383 13:36:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:23.641 BaseBdev1_malloc 00:13:23.641 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:23.899 true 00:13:23.899 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:23.899 [2024-07-15 13:36:11.443212] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:23.899 [2024-07-15 13:36:11.443252] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:23.899 [2024-07-15 13:36:11.443281] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x267f990 00:13:23.899 [2024-07-15 13:36:11.443290] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:23.899 [2024-07-15 13:36:11.444474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:23.899 [2024-07-15 13:36:11.444495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:23.899 BaseBdev1 00:13:23.899 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:23.899 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:24.156 BaseBdev2_malloc 00:13:24.156 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:24.414 true 00:13:24.414 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:24.414 [2024-07-15 13:36:11.968395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:24.414 [2024-07-15 13:36:11.968426] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:13:24.414 [2024-07-15 13:36:11.968439] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26841d0 00:13:24.414 [2024-07-15 13:36:11.968447] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:24.414 [2024-07-15 13:36:11.969399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:24.414 [2024-07-15 13:36:11.969420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:24.414 BaseBdev2 00:13:24.414 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:24.414 13:36:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:24.672 BaseBdev3_malloc 00:13:24.672 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:24.929 true 00:13:24.930 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:24.930 [2024-07-15 13:36:12.501545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:24.930 [2024-07-15 13:36:12.501580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:24.930 [2024-07-15 13:36:12.501593] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2686490 00:13:24.930 [2024-07-15 13:36:12.501617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:24.930 [2024-07-15 13:36:12.502562] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:24.930 [2024-07-15 13:36:12.502583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:24.930 BaseBdev3 00:13:24.930 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:25.187 [2024-07-15 13:36:12.674034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:25.187 [2024-07-15 13:36:12.674860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:25.187 [2024-07-15 13:36:12.674907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:25.187 [2024-07-15 13:36:12.675057] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2687b40 00:13:25.187 [2024-07-15 13:36:12.675065] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:25.187 [2024-07-15 13:36:12.675194] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26876e0 00:13:25.187 [2024-07-15 13:36:12.675287] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2687b40 00:13:25.187 [2024-07-15 13:36:12.675293] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2687b40 00:13:25.187 [2024-07-15 13:36:12.675355] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.188 13:36:12 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.188 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:25.445 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.445 "name": "raid_bdev1", 00:13:25.445 "uuid": "1468a97c-140e-4363-b8fb-d275758b4374", 00:13:25.445 "strip_size_kb": 64, 00:13:25.445 "state": "online", 00:13:25.445 "raid_level": "concat", 00:13:25.445 "superblock": true, 00:13:25.445 "num_base_bdevs": 3, 00:13:25.445 "num_base_bdevs_discovered": 3, 00:13:25.445 "num_base_bdevs_operational": 3, 00:13:25.445 "base_bdevs_list": [ 00:13:25.445 { 00:13:25.445 "name": "BaseBdev1", 00:13:25.445 "uuid": "16614c36-f57e-5bb7-8372-117768086f55", 00:13:25.445 "is_configured": true, 00:13:25.445 "data_offset": 2048, 00:13:25.445 "data_size": 63488 00:13:25.445 }, 00:13:25.445 { 00:13:25.445 "name": "BaseBdev2", 00:13:25.445 "uuid": "502dda42-d07d-5c70-ba4b-c76ab7c1ce98", 00:13:25.445 "is_configured": true, 00:13:25.445 "data_offset": 2048, 00:13:25.445 "data_size": 63488 00:13:25.445 }, 00:13:25.445 { 00:13:25.445 "name": "BaseBdev3", 00:13:25.445 "uuid": "31785ac3-c01b-5cd8-a910-212327794f50", 00:13:25.445 "is_configured": true, 00:13:25.445 "data_offset": 2048, 00:13:25.445 "data_size": 63488 00:13:25.445 } 00:13:25.445 ] 00:13:25.445 }' 00:13:25.445 13:36:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.445 13:36:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.013 13:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:26.013 13:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:26.013 [2024-07-15 13:36:13.428221] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24d5d90 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.946 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:27.204 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.204 "name": "raid_bdev1", 00:13:27.204 "uuid": "1468a97c-140e-4363-b8fb-d275758b4374", 00:13:27.204 "strip_size_kb": 64, 00:13:27.204 "state": "online", 00:13:27.204 "raid_level": "concat", 00:13:27.204 "superblock": true, 00:13:27.204 "num_base_bdevs": 3, 00:13:27.204 "num_base_bdevs_discovered": 3, 00:13:27.204 "num_base_bdevs_operational": 3, 00:13:27.204 "base_bdevs_list": [ 00:13:27.204 { 00:13:27.204 "name": "BaseBdev1", 00:13:27.204 "uuid": "16614c36-f57e-5bb7-8372-117768086f55", 00:13:27.204 "is_configured": true, 00:13:27.204 "data_offset": 2048, 00:13:27.204 "data_size": 63488 00:13:27.204 }, 00:13:27.204 { 00:13:27.204 "name": "BaseBdev2", 00:13:27.204 "uuid": "502dda42-d07d-5c70-ba4b-c76ab7c1ce98", 00:13:27.204 "is_configured": true, 00:13:27.204 "data_offset": 2048, 00:13:27.204 "data_size": 63488 00:13:27.204 }, 00:13:27.204 { 00:13:27.204 "name": "BaseBdev3", 00:13:27.204 "uuid": "31785ac3-c01b-5cd8-a910-212327794f50", 00:13:27.204 "is_configured": true, 00:13:27.204 "data_offset": 2048, 00:13:27.204 "data_size": 63488 00:13:27.204 } 00:13:27.204 ] 00:13:27.204 }' 00:13:27.204 13:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.204 13:36:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:27.781 [2024-07-15 13:36:15.368620] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:27.781 [2024-07-15 13:36:15.368659] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:27.781 [2024-07-15 
13:36:15.370665] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:27.781 [2024-07-15 13:36:15.370691] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:27.781 [2024-07-15 13:36:15.370715] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:27.781 [2024-07-15 13:36:15.370723] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2687b40 name raid_bdev1, state offline 00:13:27.781 0 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 8275 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 8275 ']' 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 8275 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:27.781 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 8275 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 8275' 00:13:28.038 killing process with pid 8275 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 8275 00:13:28.038 [2024-07-15 13:36:15.420967] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 8275 00:13:28.038 [2024-07-15 13:36:15.439979] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.gZuzdEvTAN 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:28.038 13:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:13:28.038 00:13:28.038 real 0m5.620s 00:13:28.038 user 0m8.563s 00:13:28.038 sys 0m1.026s 00:13:28.039 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:28.039 13:36:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.039 ************************************ 00:13:28.039 END TEST raid_read_error_test 00:13:28.039 ************************************ 00:13:28.295 13:36:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:28.295 13:36:15 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:13:28.295 13:36:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:28.295 13:36:15 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:13:28.295 13:36:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:28.295 ************************************ 00:13:28.295 START TEST raid_write_error_test 00:13:28.295 ************************************ 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.eD8YAtcbJs 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=9089 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 9089 /var/tmp/spdk-raid.sock 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T 
raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 9089 ']' 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:28.295 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:28.295 13:36:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.295 [2024-07-15 13:36:15.817429] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:13:28.295 [2024-07-15 13:36:15.817485] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid9089 ] 00:13:28.295 [2024-07-15 13:36:15.901963] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:28.552 [2024-07-15 13:36:15.992127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:28.552 [2024-07-15 13:36:16.055616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:28.552 [2024-07-15 13:36:16.055644] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.116 13:36:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:29.116 13:36:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:29.116 13:36:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:29.116 13:36:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:29.374 BaseBdev1_malloc 00:13:29.374 13:36:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:29.374 true 00:13:29.374 13:36:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:29.631 [2024-07-15 13:36:17.121599] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:29.631 [2024-07-15 13:36:17.121635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:29.631 [2024-07-15 13:36:17.121646] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261e990 00:13:29.631 [2024-07-15 13:36:17.121670] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:29.631 [2024-07-15 13:36:17.122840] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:29.631 [2024-07-15 13:36:17.122860] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:29.631 
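Each base device in the trace above is a three-layer stack: a malloc bdev, an error bdev wrapped around it (bdev_error_create names it EE_<base>), and a passthru bdev exposing the final BaseBdevN name to the RAID layer. Building the stack this way lets the test inject read or write failures at the error layer later without touching the RAID code path. A condensed sketch of the RPC sequence for one base device, assuming the socket from the trace is still listening (repeat for BaseBdev2 and BaseBdev3):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
$rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1_malloc        # 32 MiB backing store, 512-byte blocks
$rpc -s $sock bdev_error_create BaseBdev1_malloc                   # creates EE_BaseBdev1_malloc
$rpc -s $sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
# once all three base bdevs exist, the RAID volume is assembled on top of them:
$rpc -s $sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
# and a single failure is injected at the error layer while the workload runs:
$rpc -s $sock bdev_error_inject_error EE_BaseBdev1_malloc write failure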
BaseBdev1 00:13:29.631 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:29.631 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:29.889 BaseBdev2_malloc 00:13:29.890 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:29.890 true 00:13:29.890 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:30.148 [2024-07-15 13:36:17.658712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:30.148 [2024-07-15 13:36:17.658750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.148 [2024-07-15 13:36:17.658780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26231d0 00:13:30.148 [2024-07-15 13:36:17.658789] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.148 [2024-07-15 13:36:17.659865] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.148 [2024-07-15 13:36:17.659888] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:30.148 BaseBdev2 00:13:30.148 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:30.148 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:30.406 BaseBdev3_malloc 00:13:30.406 13:36:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:30.406 true 00:13:30.664 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:30.664 [2024-07-15 13:36:18.187853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:30.664 [2024-07-15 13:36:18.187895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.664 [2024-07-15 13:36:18.187908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2625490 00:13:30.664 [2024-07-15 13:36:18.187933] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.664 [2024-07-15 13:36:18.188906] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.664 [2024-07-15 13:36:18.188927] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:30.664 BaseBdev3 00:13:30.664 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:30.922 [2024-07-15 13:36:18.360330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is 
claimed 00:13:30.922 [2024-07-15 13:36:18.361144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:30.922 [2024-07-15 13:36:18.361191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:30.922 [2024-07-15 13:36:18.361332] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2626b40 00:13:30.922 [2024-07-15 13:36:18.361340] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:30.922 [2024-07-15 13:36:18.361471] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26266e0 00:13:30.922 [2024-07-15 13:36:18.361570] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2626b40 00:13:30.922 [2024-07-15 13:36:18.361577] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2626b40 00:13:30.922 [2024-07-15 13:36:18.361643] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.922 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.923 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.923 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.923 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:30.923 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.181 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.181 "name": "raid_bdev1", 00:13:31.181 "uuid": "03e60d56-f433-45ca-ada8-2c2624ebc818", 00:13:31.181 "strip_size_kb": 64, 00:13:31.181 "state": "online", 00:13:31.181 "raid_level": "concat", 00:13:31.181 "superblock": true, 00:13:31.181 "num_base_bdevs": 3, 00:13:31.181 "num_base_bdevs_discovered": 3, 00:13:31.181 "num_base_bdevs_operational": 3, 00:13:31.181 "base_bdevs_list": [ 00:13:31.181 { 00:13:31.181 "name": "BaseBdev1", 00:13:31.181 "uuid": "604defb3-1ee2-5585-b683-6162819f3ec4", 00:13:31.181 "is_configured": true, 00:13:31.181 "data_offset": 2048, 00:13:31.181 "data_size": 63488 00:13:31.181 }, 00:13:31.181 { 00:13:31.181 "name": "BaseBdev2", 00:13:31.181 "uuid": "64485d26-3bdb-508c-8b52-788ffb0a9fcf", 00:13:31.181 "is_configured": true, 00:13:31.181 "data_offset": 2048, 00:13:31.181 "data_size": 63488 00:13:31.181 }, 00:13:31.181 { 00:13:31.181 "name": "BaseBdev3", 00:13:31.181 "uuid": "234266c1-8911-568f-bb60-a48072e25ec1", 00:13:31.181 "is_configured": true, 
00:13:31.181 "data_offset": 2048, 00:13:31.181 "data_size": 63488 00:13:31.181 } 00:13:31.181 ] 00:13:31.181 }' 00:13:31.181 13:36:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.181 13:36:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.748 13:36:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:31.748 13:36:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:31.748 [2024-07-15 13:36:19.146598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2474d90 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.759 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:33.017 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.017 "name": "raid_bdev1", 00:13:33.017 "uuid": "03e60d56-f433-45ca-ada8-2c2624ebc818", 00:13:33.017 "strip_size_kb": 64, 00:13:33.017 "state": "online", 00:13:33.017 "raid_level": "concat", 00:13:33.017 "superblock": true, 00:13:33.017 "num_base_bdevs": 3, 00:13:33.017 "num_base_bdevs_discovered": 3, 00:13:33.017 "num_base_bdevs_operational": 3, 00:13:33.017 "base_bdevs_list": [ 00:13:33.017 { 00:13:33.017 "name": "BaseBdev1", 00:13:33.017 "uuid": "604defb3-1ee2-5585-b683-6162819f3ec4", 00:13:33.017 "is_configured": true, 00:13:33.017 "data_offset": 2048, 00:13:33.017 "data_size": 63488 00:13:33.017 }, 00:13:33.017 { 00:13:33.017 "name": "BaseBdev2", 00:13:33.017 "uuid": 
"64485d26-3bdb-508c-8b52-788ffb0a9fcf", 00:13:33.017 "is_configured": true, 00:13:33.017 "data_offset": 2048, 00:13:33.017 "data_size": 63488 00:13:33.017 }, 00:13:33.017 { 00:13:33.017 "name": "BaseBdev3", 00:13:33.017 "uuid": "234266c1-8911-568f-bb60-a48072e25ec1", 00:13:33.017 "is_configured": true, 00:13:33.017 "data_offset": 2048, 00:13:33.017 "data_size": 63488 00:13:33.017 } 00:13:33.017 ] 00:13:33.017 }' 00:13:33.017 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.017 13:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.584 13:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:33.584 [2024-07-15 13:36:21.083653] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:33.584 [2024-07-15 13:36:21.083695] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:33.584 [2024-07-15 13:36:21.085700] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:33.584 [2024-07-15 13:36:21.085727] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:33.584 [2024-07-15 13:36:21.085751] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:33.584 [2024-07-15 13:36:21.085763] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2626b40 name raid_bdev1, state offline 00:13:33.584 0 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 9089 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 9089 ']' 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 9089 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 9089 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 9089' 00:13:33.584 killing process with pid 9089 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 9089 00:13:33.584 [2024-07-15 13:36:21.150645] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:33.584 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 9089 00:13:33.584 [2024-07-15 13:36:21.170775] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.eD8YAtcbJs 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:13:33.842 00:13:33.842 real 0m5.645s 00:13:33.842 user 0m8.575s 00:13:33.842 sys 0m1.040s 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:33.842 13:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.842 ************************************ 00:13:33.842 END TEST raid_write_error_test 00:13:33.842 ************************************ 00:13:33.842 13:36:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:33.842 13:36:21 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:33.842 13:36:21 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:13:33.842 13:36:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:33.842 13:36:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:33.842 13:36:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:34.100 ************************************ 00:13:34.100 START TEST raid_state_function_test 00:13:34.100 ************************************ 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:34.100 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=10011 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 10011' 00:13:34.101 Process raid pid: 10011 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 10011 /var/tmp/spdk-raid.sock 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 10011 ']' 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:34.101 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:34.101 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.101 [2024-07-15 13:36:21.540958] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
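The pass/fail criterion that closes raid_write_error_test above is taken from the bdevperf log rather than from an RPC: the Job summary lines are dropped, the raid_bdev1 row is kept, and its sixth column is read as the per-second failure rate, which must be non-zero because exactly one write error was injected. A sketch of that check, using the log path from the trace:

fail_per_s=$(grep -v Job /raidtest/tmp.eD8YAtcbJs | grep raid_bdev1 | awk '{print $6}')
[[ $fail_per_s != "0.00" ]]    # 0.52 in this run, so the injected write error was observed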
00:13:34.101 [2024-07-15 13:36:21.541031] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:34.101 [2024-07-15 13:36:21.626747] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:34.101 [2024-07-15 13:36:21.707671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:34.359 [2024-07-15 13:36:21.766396] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:34.359 [2024-07-15 13:36:21.766431] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:34.928 [2024-07-15 13:36:22.493478] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:34.928 [2024-07-15 13:36:22.493515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:34.928 [2024-07-15 13:36:22.493522] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:34.928 [2024-07-15 13:36:22.493545] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:34.928 [2024-07-15 13:36:22.493551] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:34.928 [2024-07-15 13:36:22.493559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.928 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.188 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:13:35.188 "name": "Existed_Raid", 00:13:35.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.188 "strip_size_kb": 0, 00:13:35.188 "state": "configuring", 00:13:35.188 "raid_level": "raid1", 00:13:35.188 "superblock": false, 00:13:35.188 "num_base_bdevs": 3, 00:13:35.188 "num_base_bdevs_discovered": 0, 00:13:35.188 "num_base_bdevs_operational": 3, 00:13:35.188 "base_bdevs_list": [ 00:13:35.188 { 00:13:35.188 "name": "BaseBdev1", 00:13:35.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.188 "is_configured": false, 00:13:35.188 "data_offset": 0, 00:13:35.188 "data_size": 0 00:13:35.188 }, 00:13:35.188 { 00:13:35.188 "name": "BaseBdev2", 00:13:35.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.188 "is_configured": false, 00:13:35.188 "data_offset": 0, 00:13:35.188 "data_size": 0 00:13:35.188 }, 00:13:35.188 { 00:13:35.188 "name": "BaseBdev3", 00:13:35.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.188 "is_configured": false, 00:13:35.188 "data_offset": 0, 00:13:35.188 "data_size": 0 00:13:35.188 } 00:13:35.188 ] 00:13:35.188 }' 00:13:35.188 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.188 13:36:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.758 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:35.758 [2024-07-15 13:36:23.339570] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:35.758 [2024-07-15 13:36:23.339592] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x210cf50 name Existed_Raid, state configuring 00:13:35.758 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:36.015 [2024-07-15 13:36:23.520056] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:36.015 [2024-07-15 13:36:23.520081] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:36.015 [2024-07-15 13:36:23.520091] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:36.015 [2024-07-15 13:36:23.520099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:36.016 [2024-07-15 13:36:23.520105] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:36.016 [2024-07-15 13:36:23.520112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:36.016 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:36.274 [2024-07-15 13:36:23.693105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:36.274 BaseBdev1 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:36.274 
13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:36.274 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:36.533 [ 00:13:36.533 { 00:13:36.533 "name": "BaseBdev1", 00:13:36.533 "aliases": [ 00:13:36.533 "25c7f350-cc85-4bd3-90ca-f4c285507e8f" 00:13:36.533 ], 00:13:36.533 "product_name": "Malloc disk", 00:13:36.533 "block_size": 512, 00:13:36.533 "num_blocks": 65536, 00:13:36.533 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:36.533 "assigned_rate_limits": { 00:13:36.533 "rw_ios_per_sec": 0, 00:13:36.533 "rw_mbytes_per_sec": 0, 00:13:36.533 "r_mbytes_per_sec": 0, 00:13:36.533 "w_mbytes_per_sec": 0 00:13:36.533 }, 00:13:36.533 "claimed": true, 00:13:36.533 "claim_type": "exclusive_write", 00:13:36.533 "zoned": false, 00:13:36.533 "supported_io_types": { 00:13:36.533 "read": true, 00:13:36.533 "write": true, 00:13:36.533 "unmap": true, 00:13:36.533 "flush": true, 00:13:36.533 "reset": true, 00:13:36.533 "nvme_admin": false, 00:13:36.533 "nvme_io": false, 00:13:36.533 "nvme_io_md": false, 00:13:36.533 "write_zeroes": true, 00:13:36.533 "zcopy": true, 00:13:36.533 "get_zone_info": false, 00:13:36.533 "zone_management": false, 00:13:36.533 "zone_append": false, 00:13:36.533 "compare": false, 00:13:36.533 "compare_and_write": false, 00:13:36.533 "abort": true, 00:13:36.533 "seek_hole": false, 00:13:36.533 "seek_data": false, 00:13:36.533 "copy": true, 00:13:36.533 "nvme_iov_md": false 00:13:36.533 }, 00:13:36.533 "memory_domains": [ 00:13:36.533 { 00:13:36.533 "dma_device_id": "system", 00:13:36.533 "dma_device_type": 1 00:13:36.533 }, 00:13:36.533 { 00:13:36.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.533 "dma_device_type": 2 00:13:36.533 } 00:13:36.533 ], 00:13:36.533 "driver_specific": {} 00:13:36.533 } 00:13:36.533 ] 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:36.533 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.792 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.792 "name": "Existed_Raid", 00:13:36.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.792 "strip_size_kb": 0, 00:13:36.792 "state": "configuring", 00:13:36.792 "raid_level": "raid1", 00:13:36.792 "superblock": false, 00:13:36.792 "num_base_bdevs": 3, 00:13:36.792 "num_base_bdevs_discovered": 1, 00:13:36.792 "num_base_bdevs_operational": 3, 00:13:36.792 "base_bdevs_list": [ 00:13:36.792 { 00:13:36.792 "name": "BaseBdev1", 00:13:36.792 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:36.792 "is_configured": true, 00:13:36.792 "data_offset": 0, 00:13:36.792 "data_size": 65536 00:13:36.792 }, 00:13:36.792 { 00:13:36.792 "name": "BaseBdev2", 00:13:36.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.792 "is_configured": false, 00:13:36.792 "data_offset": 0, 00:13:36.792 "data_size": 0 00:13:36.792 }, 00:13:36.792 { 00:13:36.792 "name": "BaseBdev3", 00:13:36.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.792 "is_configured": false, 00:13:36.792 "data_offset": 0, 00:13:36.792 "data_size": 0 00:13:36.792 } 00:13:36.792 ] 00:13:36.792 }' 00:13:36.792 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.792 13:36:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.360 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:37.360 [2024-07-15 13:36:24.900210] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:37.360 [2024-07-15 13:36:24.900245] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x210c820 name Existed_Raid, state configuring 00:13:37.360 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:37.620 [2024-07-15 13:36:25.076685] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:37.620 [2024-07-15 13:36:25.077726] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:37.620 [2024-07-15 13:36:25.077753] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:37.620 [2024-07-15 13:36:25.077759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:37.620 [2024-07-15 13:36:25.077767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
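Each verify_raid_bdev_state call in this trace reduces to one RPC plus a jq filter over its output; a minimal sketch of the query that produced the JSON dump above (paths copied from the log, a running target on /var/tmp/spdk-raid.sock assumed):

/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "Existed_Raid")'    # fields compared: state, raid_level, strip_size_kb, num_base_bdevs_discovered/operational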
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.620 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.879 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.879 "name": "Existed_Raid", 00:13:37.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.879 "strip_size_kb": 0, 00:13:37.879 "state": "configuring", 00:13:37.879 "raid_level": "raid1", 00:13:37.879 "superblock": false, 00:13:37.879 "num_base_bdevs": 3, 00:13:37.879 "num_base_bdevs_discovered": 1, 00:13:37.879 "num_base_bdevs_operational": 3, 00:13:37.879 "base_bdevs_list": [ 00:13:37.879 { 00:13:37.879 "name": "BaseBdev1", 00:13:37.879 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:37.879 "is_configured": true, 00:13:37.879 "data_offset": 0, 00:13:37.879 "data_size": 65536 00:13:37.879 }, 00:13:37.879 { 00:13:37.879 "name": "BaseBdev2", 00:13:37.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.879 "is_configured": false, 00:13:37.879 "data_offset": 0, 00:13:37.879 "data_size": 0 00:13:37.879 }, 00:13:37.879 { 00:13:37.879 "name": "BaseBdev3", 00:13:37.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.879 "is_configured": false, 00:13:37.879 "data_offset": 0, 00:13:37.879 "data_size": 0 00:13:37.879 } 00:13:37.879 ] 00:13:37.879 }' 00:13:37.879 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.879 13:36:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.139 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:38.397 [2024-07-15 13:36:25.913674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:38.397 BaseBdev2 00:13:38.397 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:38.397 13:36:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:38.397 13:36:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:38.397 13:36:25 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:38.397 13:36:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:38.397 13:36:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:38.397 13:36:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:38.655 13:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:38.655 [ 00:13:38.655 { 00:13:38.655 "name": "BaseBdev2", 00:13:38.655 "aliases": [ 00:13:38.655 "6f1c6c38-0393-4a51-b667-f016487a3f1e" 00:13:38.655 ], 00:13:38.655 "product_name": "Malloc disk", 00:13:38.655 "block_size": 512, 00:13:38.655 "num_blocks": 65536, 00:13:38.655 "uuid": "6f1c6c38-0393-4a51-b667-f016487a3f1e", 00:13:38.655 "assigned_rate_limits": { 00:13:38.655 "rw_ios_per_sec": 0, 00:13:38.655 "rw_mbytes_per_sec": 0, 00:13:38.655 "r_mbytes_per_sec": 0, 00:13:38.655 "w_mbytes_per_sec": 0 00:13:38.655 }, 00:13:38.655 "claimed": true, 00:13:38.655 "claim_type": "exclusive_write", 00:13:38.655 "zoned": false, 00:13:38.655 "supported_io_types": { 00:13:38.655 "read": true, 00:13:38.655 "write": true, 00:13:38.655 "unmap": true, 00:13:38.655 "flush": true, 00:13:38.655 "reset": true, 00:13:38.655 "nvme_admin": false, 00:13:38.655 "nvme_io": false, 00:13:38.655 "nvme_io_md": false, 00:13:38.655 "write_zeroes": true, 00:13:38.655 "zcopy": true, 00:13:38.655 "get_zone_info": false, 00:13:38.655 "zone_management": false, 00:13:38.655 "zone_append": false, 00:13:38.655 "compare": false, 00:13:38.655 "compare_and_write": false, 00:13:38.655 "abort": true, 00:13:38.655 "seek_hole": false, 00:13:38.655 "seek_data": false, 00:13:38.655 "copy": true, 00:13:38.655 "nvme_iov_md": false 00:13:38.655 }, 00:13:38.655 "memory_domains": [ 00:13:38.655 { 00:13:38.655 "dma_device_id": "system", 00:13:38.655 "dma_device_type": 1 00:13:38.655 }, 00:13:38.655 { 00:13:38.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.655 "dma_device_type": 2 00:13:38.655 } 00:13:38.655 ], 00:13:38.655 "driver_specific": {} 00:13:38.655 } 00:13:38.655 ] 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.913 
13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.913 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.914 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.914 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.914 "name": "Existed_Raid", 00:13:38.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.914 "strip_size_kb": 0, 00:13:38.914 "state": "configuring", 00:13:38.914 "raid_level": "raid1", 00:13:38.914 "superblock": false, 00:13:38.914 "num_base_bdevs": 3, 00:13:38.914 "num_base_bdevs_discovered": 2, 00:13:38.914 "num_base_bdevs_operational": 3, 00:13:38.914 "base_bdevs_list": [ 00:13:38.914 { 00:13:38.914 "name": "BaseBdev1", 00:13:38.914 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:38.914 "is_configured": true, 00:13:38.914 "data_offset": 0, 00:13:38.914 "data_size": 65536 00:13:38.914 }, 00:13:38.914 { 00:13:38.914 "name": "BaseBdev2", 00:13:38.914 "uuid": "6f1c6c38-0393-4a51-b667-f016487a3f1e", 00:13:38.914 "is_configured": true, 00:13:38.914 "data_offset": 0, 00:13:38.914 "data_size": 65536 00:13:38.914 }, 00:13:38.914 { 00:13:38.914 "name": "BaseBdev3", 00:13:38.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.914 "is_configured": false, 00:13:38.914 "data_offset": 0, 00:13:38.914 "data_size": 0 00:13:38.914 } 00:13:38.914 ] 00:13:38.914 }' 00:13:38.914 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.914 13:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.479 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:39.737 [2024-07-15 13:36:27.108865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:39.737 [2024-07-15 13:36:27.108910] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x210d710 00:13:39.737 [2024-07-15 13:36:27.108916] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:39.737 [2024-07-15 13:36:27.109063] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x210d3e0 00:13:39.737 [2024-07-15 13:36:27.109154] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x210d710 00:13:39.737 [2024-07-15 13:36:27.109161] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x210d710 00:13:39.737 [2024-07-15 13:36:27.109289] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:39.737 BaseBdev3 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:39.737 13:36:27 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:39.737 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:39.996 [ 00:13:39.996 { 00:13:39.996 "name": "BaseBdev3", 00:13:39.996 "aliases": [ 00:13:39.996 "1a31294e-18db-4f5e-9963-9f6dc178f6a4" 00:13:39.996 ], 00:13:39.996 "product_name": "Malloc disk", 00:13:39.996 "block_size": 512, 00:13:39.996 "num_blocks": 65536, 00:13:39.996 "uuid": "1a31294e-18db-4f5e-9963-9f6dc178f6a4", 00:13:39.996 "assigned_rate_limits": { 00:13:39.996 "rw_ios_per_sec": 0, 00:13:39.996 "rw_mbytes_per_sec": 0, 00:13:39.996 "r_mbytes_per_sec": 0, 00:13:39.996 "w_mbytes_per_sec": 0 00:13:39.996 }, 00:13:39.996 "claimed": true, 00:13:39.996 "claim_type": "exclusive_write", 00:13:39.997 "zoned": false, 00:13:39.997 "supported_io_types": { 00:13:39.997 "read": true, 00:13:39.997 "write": true, 00:13:39.997 "unmap": true, 00:13:39.997 "flush": true, 00:13:39.997 "reset": true, 00:13:39.997 "nvme_admin": false, 00:13:39.997 "nvme_io": false, 00:13:39.997 "nvme_io_md": false, 00:13:39.997 "write_zeroes": true, 00:13:39.997 "zcopy": true, 00:13:39.997 "get_zone_info": false, 00:13:39.997 "zone_management": false, 00:13:39.997 "zone_append": false, 00:13:39.997 "compare": false, 00:13:39.997 "compare_and_write": false, 00:13:39.997 "abort": true, 00:13:39.997 "seek_hole": false, 00:13:39.997 "seek_data": false, 00:13:39.997 "copy": true, 00:13:39.997 "nvme_iov_md": false 00:13:39.997 }, 00:13:39.997 "memory_domains": [ 00:13:39.997 { 00:13:39.997 "dma_device_id": "system", 00:13:39.997 "dma_device_type": 1 00:13:39.997 }, 00:13:39.997 { 00:13:39.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.997 "dma_device_type": 2 00:13:39.997 } 00:13:39.997 ], 00:13:39.997 "driver_specific": {} 00:13:39.997 } 00:13:39.997 ] 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.997 13:36:27 
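The waitforbdev helper seen before each base-bdev JSON dump boils down to the same three RPCs each time; a condensed sketch for BaseBdev3 (RPC= is shorthand introduced here; the 32/512 sizing and the 2000 ms timeout are taken from the trace):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_malloc_create 32 512 -b BaseBdev3      # 32 MB / 512-byte blocks -> the num_blocks 65536 seen in the dump
$RPC bdev_wait_for_examine                       # wait until bdev examination has finished
$RPC bdev_get_bdevs -b BaseBdev3 -t 2000         # existence check with bdev_timeout=2000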
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.997 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.256 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.256 "name": "Existed_Raid", 00:13:40.256 "uuid": "987bfde0-5280-4dd9-879e-8b03218aa58c", 00:13:40.256 "strip_size_kb": 0, 00:13:40.256 "state": "online", 00:13:40.256 "raid_level": "raid1", 00:13:40.256 "superblock": false, 00:13:40.256 "num_base_bdevs": 3, 00:13:40.256 "num_base_bdevs_discovered": 3, 00:13:40.256 "num_base_bdevs_operational": 3, 00:13:40.256 "base_bdevs_list": [ 00:13:40.256 { 00:13:40.256 "name": "BaseBdev1", 00:13:40.256 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:40.256 "is_configured": true, 00:13:40.256 "data_offset": 0, 00:13:40.256 "data_size": 65536 00:13:40.256 }, 00:13:40.256 { 00:13:40.256 "name": "BaseBdev2", 00:13:40.256 "uuid": "6f1c6c38-0393-4a51-b667-f016487a3f1e", 00:13:40.256 "is_configured": true, 00:13:40.256 "data_offset": 0, 00:13:40.256 "data_size": 65536 00:13:40.256 }, 00:13:40.256 { 00:13:40.256 "name": "BaseBdev3", 00:13:40.256 "uuid": "1a31294e-18db-4f5e-9963-9f6dc178f6a4", 00:13:40.256 "is_configured": true, 00:13:40.256 "data_offset": 0, 00:13:40.256 "data_size": 65536 00:13:40.256 } 00:13:40.256 ] 00:13:40.256 }' 00:13:40.256 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.256 13:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:40.824 [2024-07-15 13:36:28.300146] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:40.824 "name": "Existed_Raid", 00:13:40.824 "aliases": [ 00:13:40.824 "987bfde0-5280-4dd9-879e-8b03218aa58c" 00:13:40.824 ], 00:13:40.824 "product_name": "Raid Volume", 00:13:40.824 "block_size": 512, 00:13:40.824 "num_blocks": 65536, 00:13:40.824 "uuid": "987bfde0-5280-4dd9-879e-8b03218aa58c", 
00:13:40.824 "assigned_rate_limits": { 00:13:40.824 "rw_ios_per_sec": 0, 00:13:40.824 "rw_mbytes_per_sec": 0, 00:13:40.824 "r_mbytes_per_sec": 0, 00:13:40.824 "w_mbytes_per_sec": 0 00:13:40.824 }, 00:13:40.824 "claimed": false, 00:13:40.824 "zoned": false, 00:13:40.824 "supported_io_types": { 00:13:40.824 "read": true, 00:13:40.824 "write": true, 00:13:40.824 "unmap": false, 00:13:40.824 "flush": false, 00:13:40.824 "reset": true, 00:13:40.824 "nvme_admin": false, 00:13:40.824 "nvme_io": false, 00:13:40.824 "nvme_io_md": false, 00:13:40.824 "write_zeroes": true, 00:13:40.824 "zcopy": false, 00:13:40.824 "get_zone_info": false, 00:13:40.824 "zone_management": false, 00:13:40.824 "zone_append": false, 00:13:40.824 "compare": false, 00:13:40.824 "compare_and_write": false, 00:13:40.824 "abort": false, 00:13:40.824 "seek_hole": false, 00:13:40.824 "seek_data": false, 00:13:40.824 "copy": false, 00:13:40.824 "nvme_iov_md": false 00:13:40.824 }, 00:13:40.824 "memory_domains": [ 00:13:40.824 { 00:13:40.824 "dma_device_id": "system", 00:13:40.824 "dma_device_type": 1 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.824 "dma_device_type": 2 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "dma_device_id": "system", 00:13:40.824 "dma_device_type": 1 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.824 "dma_device_type": 2 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "dma_device_id": "system", 00:13:40.824 "dma_device_type": 1 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.824 "dma_device_type": 2 00:13:40.824 } 00:13:40.824 ], 00:13:40.824 "driver_specific": { 00:13:40.824 "raid": { 00:13:40.824 "uuid": "987bfde0-5280-4dd9-879e-8b03218aa58c", 00:13:40.824 "strip_size_kb": 0, 00:13:40.824 "state": "online", 00:13:40.824 "raid_level": "raid1", 00:13:40.824 "superblock": false, 00:13:40.824 "num_base_bdevs": 3, 00:13:40.824 "num_base_bdevs_discovered": 3, 00:13:40.824 "num_base_bdevs_operational": 3, 00:13:40.824 "base_bdevs_list": [ 00:13:40.824 { 00:13:40.824 "name": "BaseBdev1", 00:13:40.824 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:40.824 "is_configured": true, 00:13:40.824 "data_offset": 0, 00:13:40.824 "data_size": 65536 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "name": "BaseBdev2", 00:13:40.824 "uuid": "6f1c6c38-0393-4a51-b667-f016487a3f1e", 00:13:40.824 "is_configured": true, 00:13:40.824 "data_offset": 0, 00:13:40.824 "data_size": 65536 00:13:40.824 }, 00:13:40.824 { 00:13:40.824 "name": "BaseBdev3", 00:13:40.824 "uuid": "1a31294e-18db-4f5e-9963-9f6dc178f6a4", 00:13:40.824 "is_configured": true, 00:13:40.824 "data_offset": 0, 00:13:40.824 "data_size": 65536 00:13:40.824 } 00:13:40.824 ] 00:13:40.824 } 00:13:40.824 } 00:13:40.824 }' 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:40.824 BaseBdev2 00:13:40.824 BaseBdev3' 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:40.824 13:36:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:41.084 "name": "BaseBdev1", 00:13:41.084 "aliases": [ 00:13:41.084 "25c7f350-cc85-4bd3-90ca-f4c285507e8f" 00:13:41.084 ], 00:13:41.084 "product_name": "Malloc disk", 00:13:41.084 "block_size": 512, 00:13:41.084 "num_blocks": 65536, 00:13:41.084 "uuid": "25c7f350-cc85-4bd3-90ca-f4c285507e8f", 00:13:41.084 "assigned_rate_limits": { 00:13:41.084 "rw_ios_per_sec": 0, 00:13:41.084 "rw_mbytes_per_sec": 0, 00:13:41.084 "r_mbytes_per_sec": 0, 00:13:41.084 "w_mbytes_per_sec": 0 00:13:41.084 }, 00:13:41.084 "claimed": true, 00:13:41.084 "claim_type": "exclusive_write", 00:13:41.084 "zoned": false, 00:13:41.084 "supported_io_types": { 00:13:41.084 "read": true, 00:13:41.084 "write": true, 00:13:41.084 "unmap": true, 00:13:41.084 "flush": true, 00:13:41.084 "reset": true, 00:13:41.084 "nvme_admin": false, 00:13:41.084 "nvme_io": false, 00:13:41.084 "nvme_io_md": false, 00:13:41.084 "write_zeroes": true, 00:13:41.084 "zcopy": true, 00:13:41.084 "get_zone_info": false, 00:13:41.084 "zone_management": false, 00:13:41.084 "zone_append": false, 00:13:41.084 "compare": false, 00:13:41.084 "compare_and_write": false, 00:13:41.084 "abort": true, 00:13:41.084 "seek_hole": false, 00:13:41.084 "seek_data": false, 00:13:41.084 "copy": true, 00:13:41.084 "nvme_iov_md": false 00:13:41.084 }, 00:13:41.084 "memory_domains": [ 00:13:41.084 { 00:13:41.084 "dma_device_id": "system", 00:13:41.084 "dma_device_type": 1 00:13:41.084 }, 00:13:41.084 { 00:13:41.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.084 "dma_device_type": 2 00:13:41.084 } 00:13:41.084 ], 00:13:41.084 "driver_specific": {} 00:13:41.084 }' 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:41.084 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:41.343 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.600 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:41.600 "name": "BaseBdev2", 
00:13:41.600 "aliases": [ 00:13:41.600 "6f1c6c38-0393-4a51-b667-f016487a3f1e" 00:13:41.600 ], 00:13:41.600 "product_name": "Malloc disk", 00:13:41.600 "block_size": 512, 00:13:41.600 "num_blocks": 65536, 00:13:41.600 "uuid": "6f1c6c38-0393-4a51-b667-f016487a3f1e", 00:13:41.600 "assigned_rate_limits": { 00:13:41.600 "rw_ios_per_sec": 0, 00:13:41.600 "rw_mbytes_per_sec": 0, 00:13:41.600 "r_mbytes_per_sec": 0, 00:13:41.601 "w_mbytes_per_sec": 0 00:13:41.601 }, 00:13:41.601 "claimed": true, 00:13:41.601 "claim_type": "exclusive_write", 00:13:41.601 "zoned": false, 00:13:41.601 "supported_io_types": { 00:13:41.601 "read": true, 00:13:41.601 "write": true, 00:13:41.601 "unmap": true, 00:13:41.601 "flush": true, 00:13:41.601 "reset": true, 00:13:41.601 "nvme_admin": false, 00:13:41.601 "nvme_io": false, 00:13:41.601 "nvme_io_md": false, 00:13:41.601 "write_zeroes": true, 00:13:41.601 "zcopy": true, 00:13:41.601 "get_zone_info": false, 00:13:41.601 "zone_management": false, 00:13:41.601 "zone_append": false, 00:13:41.601 "compare": false, 00:13:41.601 "compare_and_write": false, 00:13:41.601 "abort": true, 00:13:41.601 "seek_hole": false, 00:13:41.601 "seek_data": false, 00:13:41.601 "copy": true, 00:13:41.601 "nvme_iov_md": false 00:13:41.601 }, 00:13:41.601 "memory_domains": [ 00:13:41.601 { 00:13:41.601 "dma_device_id": "system", 00:13:41.601 "dma_device_type": 1 00:13:41.601 }, 00:13:41.601 { 00:13:41.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.601 "dma_device_type": 2 00:13:41.601 } 00:13:41.601 ], 00:13:41.601 "driver_specific": {} 00:13:41.601 }' 00:13:41.601 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.601 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:41.860 "name": "BaseBdev3", 00:13:41.860 "aliases": [ 00:13:41.860 "1a31294e-18db-4f5e-9963-9f6dc178f6a4" 00:13:41.860 ], 00:13:41.860 "product_name": "Malloc disk", 00:13:41.860 "block_size": 512, 
00:13:41.860 "num_blocks": 65536, 00:13:41.860 "uuid": "1a31294e-18db-4f5e-9963-9f6dc178f6a4", 00:13:41.860 "assigned_rate_limits": { 00:13:41.860 "rw_ios_per_sec": 0, 00:13:41.860 "rw_mbytes_per_sec": 0, 00:13:41.860 "r_mbytes_per_sec": 0, 00:13:41.860 "w_mbytes_per_sec": 0 00:13:41.860 }, 00:13:41.860 "claimed": true, 00:13:41.860 "claim_type": "exclusive_write", 00:13:41.860 "zoned": false, 00:13:41.860 "supported_io_types": { 00:13:41.860 "read": true, 00:13:41.860 "write": true, 00:13:41.860 "unmap": true, 00:13:41.860 "flush": true, 00:13:41.860 "reset": true, 00:13:41.860 "nvme_admin": false, 00:13:41.860 "nvme_io": false, 00:13:41.860 "nvme_io_md": false, 00:13:41.860 "write_zeroes": true, 00:13:41.860 "zcopy": true, 00:13:41.860 "get_zone_info": false, 00:13:41.860 "zone_management": false, 00:13:41.860 "zone_append": false, 00:13:41.860 "compare": false, 00:13:41.860 "compare_and_write": false, 00:13:41.860 "abort": true, 00:13:41.860 "seek_hole": false, 00:13:41.860 "seek_data": false, 00:13:41.860 "copy": true, 00:13:41.860 "nvme_iov_md": false 00:13:41.860 }, 00:13:41.860 "memory_domains": [ 00:13:41.860 { 00:13:41.860 "dma_device_id": "system", 00:13:41.860 "dma_device_type": 1 00:13:41.860 }, 00:13:41.860 { 00:13:41.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.860 "dma_device_type": 2 00:13:41.860 } 00:13:41.860 ], 00:13:41.860 "driver_specific": {} 00:13:41.860 }' 00:13:41.860 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:42.118 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:42.378 [2024-07-15 13:36:29.892100] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.378 13:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.636 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.636 "name": "Existed_Raid", 00:13:42.636 "uuid": "987bfde0-5280-4dd9-879e-8b03218aa58c", 00:13:42.636 "strip_size_kb": 0, 00:13:42.636 "state": "online", 00:13:42.636 "raid_level": "raid1", 00:13:42.636 "superblock": false, 00:13:42.636 "num_base_bdevs": 3, 00:13:42.636 "num_base_bdevs_discovered": 2, 00:13:42.636 "num_base_bdevs_operational": 2, 00:13:42.636 "base_bdevs_list": [ 00:13:42.636 { 00:13:42.636 "name": null, 00:13:42.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.636 "is_configured": false, 00:13:42.636 "data_offset": 0, 00:13:42.636 "data_size": 65536 00:13:42.636 }, 00:13:42.636 { 00:13:42.636 "name": "BaseBdev2", 00:13:42.636 "uuid": "6f1c6c38-0393-4a51-b667-f016487a3f1e", 00:13:42.636 "is_configured": true, 00:13:42.636 "data_offset": 0, 00:13:42.636 "data_size": 65536 00:13:42.636 }, 00:13:42.637 { 00:13:42.637 "name": "BaseBdev3", 00:13:42.637 "uuid": "1a31294e-18db-4f5e-9963-9f6dc178f6a4", 00:13:42.637 "is_configured": true, 00:13:42.637 "data_offset": 0, 00:13:42.637 "data_size": 65536 00:13:42.637 } 00:13:42.637 ] 00:13:42.637 }' 00:13:42.637 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.637 13:36:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
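The 3/3-to-2/2 transition verified just above is triggered simply by deleting one malloc base bdev out from under the array; because raid1 has redundancy, the expected state remains online. A sketch of the two steps, using the same commands that appear in the trace (RPC= shorthand and paths as assumed earlier):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_malloc_delete BaseBdev1                # Existed_Raid stays online with 2 of 3 members
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'   # num_base_bdevs_discovered/operational drop to 2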
'!=' Existed_Raid ']' 00:13:43.204 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:43.462 [2024-07-15 13:36:30.967979] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:43.462 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:43.462 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:43.462 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.462 13:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:43.720 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:43.720 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:43.720 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:43.720 [2024-07-15 13:36:31.331624] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:43.720 [2024-07-15 13:36:31.331690] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.979 [2024-07-15 13:36:31.341948] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.979 [2024-07-15 13:36:31.341974] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:43.979 [2024-07-15 13:36:31.341982] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x210d710 name Existed_Raid, state offline 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:43.979 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:44.238 BaseBdev2 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:44.238 13:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:44.497 13:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:44.497 [ 00:13:44.497 { 00:13:44.497 "name": "BaseBdev2", 00:13:44.497 "aliases": [ 00:13:44.497 "b68b8849-349f-44b2-a628-9d063abbd21e" 00:13:44.497 ], 00:13:44.497 "product_name": "Malloc disk", 00:13:44.497 "block_size": 512, 00:13:44.497 "num_blocks": 65536, 00:13:44.497 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:44.497 "assigned_rate_limits": { 00:13:44.497 "rw_ios_per_sec": 0, 00:13:44.497 "rw_mbytes_per_sec": 0, 00:13:44.497 "r_mbytes_per_sec": 0, 00:13:44.497 "w_mbytes_per_sec": 0 00:13:44.497 }, 00:13:44.497 "claimed": false, 00:13:44.497 "zoned": false, 00:13:44.497 "supported_io_types": { 00:13:44.497 "read": true, 00:13:44.497 "write": true, 00:13:44.497 "unmap": true, 00:13:44.497 "flush": true, 00:13:44.497 "reset": true, 00:13:44.497 "nvme_admin": false, 00:13:44.497 "nvme_io": false, 00:13:44.497 "nvme_io_md": false, 00:13:44.497 "write_zeroes": true, 00:13:44.497 "zcopy": true, 00:13:44.497 "get_zone_info": false, 00:13:44.497 "zone_management": false, 00:13:44.497 "zone_append": false, 00:13:44.497 "compare": false, 00:13:44.497 "compare_and_write": false, 00:13:44.497 "abort": true, 00:13:44.497 "seek_hole": false, 00:13:44.497 "seek_data": false, 00:13:44.497 "copy": true, 00:13:44.497 "nvme_iov_md": false 00:13:44.497 }, 00:13:44.497 "memory_domains": [ 00:13:44.497 { 00:13:44.497 "dma_device_id": "system", 00:13:44.497 "dma_device_type": 1 00:13:44.498 }, 00:13:44.498 { 00:13:44.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.498 "dma_device_type": 2 00:13:44.498 } 00:13:44.498 ], 00:13:44.498 "driver_specific": {} 00:13:44.498 } 00:13:44.498 ] 00:13:44.498 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:44.498 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:44.498 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:44.498 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:44.757 BaseBdev3 00:13:44.757 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:44.757 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:44.757 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:44.757 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:44.757 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:44.757 13:36:32 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:44.757 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:45.016 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:45.016 [ 00:13:45.016 { 00:13:45.016 "name": "BaseBdev3", 00:13:45.016 "aliases": [ 00:13:45.016 "ac00f476-5490-4711-bb44-1c665137b475" 00:13:45.016 ], 00:13:45.016 "product_name": "Malloc disk", 00:13:45.016 "block_size": 512, 00:13:45.016 "num_blocks": 65536, 00:13:45.016 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:45.016 "assigned_rate_limits": { 00:13:45.016 "rw_ios_per_sec": 0, 00:13:45.016 "rw_mbytes_per_sec": 0, 00:13:45.016 "r_mbytes_per_sec": 0, 00:13:45.016 "w_mbytes_per_sec": 0 00:13:45.016 }, 00:13:45.016 "claimed": false, 00:13:45.016 "zoned": false, 00:13:45.016 "supported_io_types": { 00:13:45.016 "read": true, 00:13:45.016 "write": true, 00:13:45.016 "unmap": true, 00:13:45.016 "flush": true, 00:13:45.016 "reset": true, 00:13:45.016 "nvme_admin": false, 00:13:45.016 "nvme_io": false, 00:13:45.016 "nvme_io_md": false, 00:13:45.016 "write_zeroes": true, 00:13:45.016 "zcopy": true, 00:13:45.016 "get_zone_info": false, 00:13:45.016 "zone_management": false, 00:13:45.016 "zone_append": false, 00:13:45.016 "compare": false, 00:13:45.016 "compare_and_write": false, 00:13:45.016 "abort": true, 00:13:45.016 "seek_hole": false, 00:13:45.016 "seek_data": false, 00:13:45.016 "copy": true, 00:13:45.016 "nvme_iov_md": false 00:13:45.016 }, 00:13:45.016 "memory_domains": [ 00:13:45.016 { 00:13:45.016 "dma_device_id": "system", 00:13:45.016 "dma_device_type": 1 00:13:45.016 }, 00:13:45.016 { 00:13:45.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.016 "dma_device_type": 2 00:13:45.016 } 00:13:45.016 ], 00:13:45.016 "driver_specific": {} 00:13:45.016 } 00:13:45.016 ] 00:13:45.016 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:45.016 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:45.016 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:45.016 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.275 [2024-07-15 13:36:32.706460] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.275 [2024-07-15 13:36:32.706497] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.275 [2024-07-15 13:36:32.706510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:45.275 [2024-07-15 13:36:32.707530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.275 "name": "Existed_Raid", 00:13:45.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.275 "strip_size_kb": 0, 00:13:45.275 "state": "configuring", 00:13:45.275 "raid_level": "raid1", 00:13:45.275 "superblock": false, 00:13:45.275 "num_base_bdevs": 3, 00:13:45.275 "num_base_bdevs_discovered": 2, 00:13:45.275 "num_base_bdevs_operational": 3, 00:13:45.275 "base_bdevs_list": [ 00:13:45.275 { 00:13:45.275 "name": "BaseBdev1", 00:13:45.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.275 "is_configured": false, 00:13:45.275 "data_offset": 0, 00:13:45.275 "data_size": 0 00:13:45.275 }, 00:13:45.275 { 00:13:45.275 "name": "BaseBdev2", 00:13:45.275 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:45.275 "is_configured": true, 00:13:45.275 "data_offset": 0, 00:13:45.275 "data_size": 65536 00:13:45.275 }, 00:13:45.275 { 00:13:45.275 "name": "BaseBdev3", 00:13:45.275 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:45.275 "is_configured": true, 00:13:45.275 "data_offset": 0, 00:13:45.275 "data_size": 65536 00:13:45.275 } 00:13:45.275 ] 00:13:45.275 }' 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.275 13:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.842 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:46.100 [2024-07-15 13:36:33.532569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.100 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.358 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.358 "name": "Existed_Raid", 00:13:46.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.358 "strip_size_kb": 0, 00:13:46.358 "state": "configuring", 00:13:46.358 "raid_level": "raid1", 00:13:46.358 "superblock": false, 00:13:46.358 "num_base_bdevs": 3, 00:13:46.358 "num_base_bdevs_discovered": 1, 00:13:46.358 "num_base_bdevs_operational": 3, 00:13:46.358 "base_bdevs_list": [ 00:13:46.358 { 00:13:46.358 "name": "BaseBdev1", 00:13:46.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.358 "is_configured": false, 00:13:46.358 "data_offset": 0, 00:13:46.358 "data_size": 0 00:13:46.358 }, 00:13:46.358 { 00:13:46.358 "name": null, 00:13:46.358 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:46.358 "is_configured": false, 00:13:46.358 "data_offset": 0, 00:13:46.358 "data_size": 65536 00:13:46.358 }, 00:13:46.358 { 00:13:46.358 "name": "BaseBdev3", 00:13:46.358 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:46.358 "is_configured": true, 00:13:46.358 "data_offset": 0, 00:13:46.358 "data_size": 65536 00:13:46.358 } 00:13:46.358 ] 00:13:46.358 }' 00:13:46.358 13:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.358 13:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.629 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.629 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:46.888 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:46.888 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:47.147 [2024-07-15 13:36:34.547258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:47.147 BaseBdev1 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- 
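While the array is still configuring, removing a base bdev only empties its slot; the JSON above shows BaseBdev2's entry reverting to "name": null while num_base_bdevs stays 3. A minimal sketch of the removal and the follow-up check (same RPC shorthand and paths as assumed earlier, commands taken from the trace):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_remove_base_bdev BaseBdev2
$RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'       # expected: false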
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:47.147 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:47.405 [ 00:13:47.405 { 00:13:47.405 "name": "BaseBdev1", 00:13:47.405 "aliases": [ 00:13:47.405 "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1" 00:13:47.405 ], 00:13:47.405 "product_name": "Malloc disk", 00:13:47.405 "block_size": 512, 00:13:47.405 "num_blocks": 65536, 00:13:47.405 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:47.405 "assigned_rate_limits": { 00:13:47.406 "rw_ios_per_sec": 0, 00:13:47.406 "rw_mbytes_per_sec": 0, 00:13:47.406 "r_mbytes_per_sec": 0, 00:13:47.406 "w_mbytes_per_sec": 0 00:13:47.406 }, 00:13:47.406 "claimed": true, 00:13:47.406 "claim_type": "exclusive_write", 00:13:47.406 "zoned": false, 00:13:47.406 "supported_io_types": { 00:13:47.406 "read": true, 00:13:47.406 "write": true, 00:13:47.406 "unmap": true, 00:13:47.406 "flush": true, 00:13:47.406 "reset": true, 00:13:47.406 "nvme_admin": false, 00:13:47.406 "nvme_io": false, 00:13:47.406 "nvme_io_md": false, 00:13:47.406 "write_zeroes": true, 00:13:47.406 "zcopy": true, 00:13:47.406 "get_zone_info": false, 00:13:47.406 "zone_management": false, 00:13:47.406 "zone_append": false, 00:13:47.406 "compare": false, 00:13:47.406 "compare_and_write": false, 00:13:47.406 "abort": true, 00:13:47.406 "seek_hole": false, 00:13:47.406 "seek_data": false, 00:13:47.406 "copy": true, 00:13:47.406 "nvme_iov_md": false 00:13:47.406 }, 00:13:47.406 "memory_domains": [ 00:13:47.406 { 00:13:47.406 "dma_device_id": "system", 00:13:47.406 "dma_device_type": 1 00:13:47.406 }, 00:13:47.406 { 00:13:47.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.406 "dma_device_type": 2 00:13:47.406 } 00:13:47.406 ], 00:13:47.406 "driver_specific": {} 00:13:47.406 } 00:13:47.406 ] 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.406 13:36:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.406 13:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.663 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.663 "name": "Existed_Raid", 00:13:47.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.663 "strip_size_kb": 0, 00:13:47.663 "state": "configuring", 00:13:47.663 "raid_level": "raid1", 00:13:47.663 "superblock": false, 00:13:47.663 "num_base_bdevs": 3, 00:13:47.663 "num_base_bdevs_discovered": 2, 00:13:47.663 "num_base_bdevs_operational": 3, 00:13:47.663 "base_bdevs_list": [ 00:13:47.663 { 00:13:47.663 "name": "BaseBdev1", 00:13:47.663 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:47.663 "is_configured": true, 00:13:47.663 "data_offset": 0, 00:13:47.663 "data_size": 65536 00:13:47.663 }, 00:13:47.663 { 00:13:47.663 "name": null, 00:13:47.664 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:47.664 "is_configured": false, 00:13:47.664 "data_offset": 0, 00:13:47.664 "data_size": 65536 00:13:47.664 }, 00:13:47.664 { 00:13:47.664 "name": "BaseBdev3", 00:13:47.664 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:47.664 "is_configured": true, 00:13:47.664 "data_offset": 0, 00:13:47.664 "data_size": 65536 00:13:47.664 } 00:13:47.664 ] 00:13:47.664 }' 00:13:47.664 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.664 13:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.229 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.229 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:48.229 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:48.229 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:48.488 [2024-07-15 13:36:35.918805] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.488 
13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.488 13:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.488 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.488 "name": "Existed_Raid", 00:13:48.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.488 "strip_size_kb": 0, 00:13:48.488 "state": "configuring", 00:13:48.488 "raid_level": "raid1", 00:13:48.488 "superblock": false, 00:13:48.488 "num_base_bdevs": 3, 00:13:48.488 "num_base_bdevs_discovered": 1, 00:13:48.488 "num_base_bdevs_operational": 3, 00:13:48.488 "base_bdevs_list": [ 00:13:48.488 { 00:13:48.488 "name": "BaseBdev1", 00:13:48.488 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:48.488 "is_configured": true, 00:13:48.488 "data_offset": 0, 00:13:48.488 "data_size": 65536 00:13:48.488 }, 00:13:48.488 { 00:13:48.488 "name": null, 00:13:48.488 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:48.488 "is_configured": false, 00:13:48.488 "data_offset": 0, 00:13:48.488 "data_size": 65536 00:13:48.488 }, 00:13:48.488 { 00:13:48.488 "name": null, 00:13:48.488 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:48.488 "is_configured": false, 00:13:48.488 "data_offset": 0, 00:13:48.488 "data_size": 65536 00:13:48.488 } 00:13:48.488 ] 00:13:48.488 }' 00:13:48.488 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.488 13:36:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.055 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.055 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:49.314 [2024-07-15 13:36:36.913379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.314 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.572 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.572 13:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.572 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.572 "name": "Existed_Raid", 00:13:49.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.572 "strip_size_kb": 0, 00:13:49.572 "state": "configuring", 00:13:49.572 "raid_level": "raid1", 00:13:49.572 "superblock": false, 00:13:49.572 "num_base_bdevs": 3, 00:13:49.572 "num_base_bdevs_discovered": 2, 00:13:49.572 "num_base_bdevs_operational": 3, 00:13:49.572 "base_bdevs_list": [ 00:13:49.572 { 00:13:49.572 "name": "BaseBdev1", 00:13:49.572 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:49.572 "is_configured": true, 00:13:49.572 "data_offset": 0, 00:13:49.572 "data_size": 65536 00:13:49.572 }, 00:13:49.572 { 00:13:49.572 "name": null, 00:13:49.572 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:49.572 "is_configured": false, 00:13:49.572 "data_offset": 0, 00:13:49.572 "data_size": 65536 00:13:49.572 }, 00:13:49.572 { 00:13:49.572 "name": "BaseBdev3", 00:13:49.572 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:49.572 "is_configured": true, 00:13:49.572 "data_offset": 0, 00:13:49.572 "data_size": 65536 00:13:49.572 } 00:13:49.572 ] 00:13:49.572 }' 00:13:49.572 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.572 13:36:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.140 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.140 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:50.399 [2024-07-15 13:36:37.932018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.399 
13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.399 13:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.658 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.658 "name": "Existed_Raid", 00:13:50.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.658 "strip_size_kb": 0, 00:13:50.658 "state": "configuring", 00:13:50.658 "raid_level": "raid1", 00:13:50.658 "superblock": false, 00:13:50.658 "num_base_bdevs": 3, 00:13:50.658 "num_base_bdevs_discovered": 1, 00:13:50.658 "num_base_bdevs_operational": 3, 00:13:50.658 "base_bdevs_list": [ 00:13:50.658 { 00:13:50.658 "name": null, 00:13:50.658 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:50.658 "is_configured": false, 00:13:50.658 "data_offset": 0, 00:13:50.658 "data_size": 65536 00:13:50.658 }, 00:13:50.658 { 00:13:50.658 "name": null, 00:13:50.658 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:50.658 "is_configured": false, 00:13:50.658 "data_offset": 0, 00:13:50.658 "data_size": 65536 00:13:50.658 }, 00:13:50.658 { 00:13:50.658 "name": "BaseBdev3", 00:13:50.658 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:50.658 "is_configured": true, 00:13:50.658 "data_offset": 0, 00:13:50.658 "data_size": 65536 00:13:50.658 } 00:13:50.658 ] 00:13:50.658 }' 00:13:50.658 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.658 13:36:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.223 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.223 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:51.223 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:51.223 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:51.483 [2024-07-15 13:36:38.914078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.483 13:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.771 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.771 "name": "Existed_Raid", 00:13:51.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.771 "strip_size_kb": 0, 00:13:51.771 "state": "configuring", 00:13:51.771 "raid_level": "raid1", 00:13:51.771 "superblock": false, 00:13:51.771 "num_base_bdevs": 3, 00:13:51.771 "num_base_bdevs_discovered": 2, 00:13:51.771 "num_base_bdevs_operational": 3, 00:13:51.771 "base_bdevs_list": [ 00:13:51.771 { 00:13:51.771 "name": null, 00:13:51.771 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:51.771 "is_configured": false, 00:13:51.771 "data_offset": 0, 00:13:51.771 "data_size": 65536 00:13:51.771 }, 00:13:51.771 { 00:13:51.771 "name": "BaseBdev2", 00:13:51.771 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:51.771 "is_configured": true, 00:13:51.771 "data_offset": 0, 00:13:51.771 "data_size": 65536 00:13:51.771 }, 00:13:51.771 { 00:13:51.771 "name": "BaseBdev3", 00:13:51.771 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:51.771 "is_configured": true, 00:13:51.771 "data_offset": 0, 00:13:51.771 "data_size": 65536 00:13:51.771 } 00:13:51.771 ] 00:13:51.771 }' 00:13:51.771 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.771 13:36:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.030 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.030 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:52.288 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:52.288 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:52.288 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.548 13:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1 00:13:52.548 [2024-07-15 13:36:40.104072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:52.548 [2024-07-15 13:36:40.104107] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x210eeb0 00:13:52.548 [2024-07-15 13:36:40.104113] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:52.548 [2024-07-15 13:36:40.104257] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x210cf20 00:13:52.548 [2024-07-15 13:36:40.104356] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x210eeb0 00:13:52.548 [2024-07-15 13:36:40.104363] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x210eeb0 00:13:52.548 [2024-07-15 13:36:40.104485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:52.548 NewBaseBdev 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:52.548 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:52.825 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:53.087 [ 00:13:53.087 { 00:13:53.087 "name": "NewBaseBdev", 00:13:53.087 "aliases": [ 00:13:53.087 "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1" 00:13:53.087 ], 00:13:53.087 "product_name": "Malloc disk", 00:13:53.087 "block_size": 512, 00:13:53.087 "num_blocks": 65536, 00:13:53.087 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:53.087 "assigned_rate_limits": { 00:13:53.087 "rw_ios_per_sec": 0, 00:13:53.087 "rw_mbytes_per_sec": 0, 00:13:53.087 "r_mbytes_per_sec": 0, 00:13:53.087 "w_mbytes_per_sec": 0 00:13:53.087 }, 00:13:53.087 "claimed": true, 00:13:53.087 "claim_type": "exclusive_write", 00:13:53.087 "zoned": false, 00:13:53.087 "supported_io_types": { 00:13:53.087 "read": true, 00:13:53.087 "write": true, 00:13:53.087 "unmap": true, 00:13:53.087 "flush": true, 00:13:53.087 "reset": true, 00:13:53.087 "nvme_admin": false, 00:13:53.087 "nvme_io": false, 00:13:53.087 "nvme_io_md": false, 00:13:53.087 "write_zeroes": true, 00:13:53.087 "zcopy": true, 00:13:53.087 "get_zone_info": false, 00:13:53.087 "zone_management": false, 00:13:53.087 "zone_append": false, 00:13:53.087 "compare": false, 00:13:53.087 "compare_and_write": false, 00:13:53.087 "abort": true, 00:13:53.087 "seek_hole": false, 00:13:53.087 "seek_data": false, 00:13:53.087 "copy": true, 00:13:53.087 "nvme_iov_md": false 00:13:53.087 }, 00:13:53.087 "memory_domains": [ 00:13:53.087 { 00:13:53.087 "dma_device_id": "system", 00:13:53.087 "dma_device_type": 1 00:13:53.087 }, 00:13:53.087 { 00:13:53.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.087 "dma_device_type": 2 00:13:53.087 } 00:13:53.087 ], 00:13:53.087 "driver_specific": {} 00:13:53.087 } 00:13:53.087 ] 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.087 "name": "Existed_Raid", 00:13:53.087 "uuid": "11e26cf2-6efc-4ba5-8408-19bca68d803c", 00:13:53.087 "strip_size_kb": 0, 00:13:53.087 "state": "online", 00:13:53.087 "raid_level": "raid1", 00:13:53.087 "superblock": false, 00:13:53.087 "num_base_bdevs": 3, 00:13:53.087 "num_base_bdevs_discovered": 3, 00:13:53.087 "num_base_bdevs_operational": 3, 00:13:53.087 "base_bdevs_list": [ 00:13:53.087 { 00:13:53.087 "name": "NewBaseBdev", 00:13:53.087 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:53.087 "is_configured": true, 00:13:53.087 "data_offset": 0, 00:13:53.087 "data_size": 65536 00:13:53.087 }, 00:13:53.087 { 00:13:53.087 "name": "BaseBdev2", 00:13:53.087 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:53.087 "is_configured": true, 00:13:53.087 "data_offset": 0, 00:13:53.087 "data_size": 65536 00:13:53.087 }, 00:13:53.087 { 00:13:53.087 "name": "BaseBdev3", 00:13:53.087 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:53.087 "is_configured": true, 00:13:53.087 "data_offset": 0, 00:13:53.087 "data_size": 65536 00:13:53.087 } 00:13:53.087 ] 00:13:53.087 }' 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.087 13:36:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:53.654 13:36:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:53.654 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:53.919 [2024-07-15 13:36:41.323411] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:53.919 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:53.919 "name": "Existed_Raid", 00:13:53.919 "aliases": [ 00:13:53.919 "11e26cf2-6efc-4ba5-8408-19bca68d803c" 00:13:53.919 ], 00:13:53.919 "product_name": "Raid Volume", 00:13:53.919 "block_size": 512, 00:13:53.919 "num_blocks": 65536, 00:13:53.919 "uuid": "11e26cf2-6efc-4ba5-8408-19bca68d803c", 00:13:53.919 "assigned_rate_limits": { 00:13:53.919 "rw_ios_per_sec": 0, 00:13:53.919 "rw_mbytes_per_sec": 0, 00:13:53.919 "r_mbytes_per_sec": 0, 00:13:53.919 "w_mbytes_per_sec": 0 00:13:53.919 }, 00:13:53.919 "claimed": false, 00:13:53.919 "zoned": false, 00:13:53.919 "supported_io_types": { 00:13:53.919 "read": true, 00:13:53.919 "write": true, 00:13:53.919 "unmap": false, 00:13:53.919 "flush": false, 00:13:53.919 "reset": true, 00:13:53.919 "nvme_admin": false, 00:13:53.919 "nvme_io": false, 00:13:53.919 "nvme_io_md": false, 00:13:53.919 "write_zeroes": true, 00:13:53.919 "zcopy": false, 00:13:53.919 "get_zone_info": false, 00:13:53.919 "zone_management": false, 00:13:53.919 "zone_append": false, 00:13:53.919 "compare": false, 00:13:53.919 "compare_and_write": false, 00:13:53.919 "abort": false, 00:13:53.919 "seek_hole": false, 00:13:53.919 "seek_data": false, 00:13:53.919 "copy": false, 00:13:53.919 "nvme_iov_md": false 00:13:53.919 }, 00:13:53.919 "memory_domains": [ 00:13:53.919 { 00:13:53.919 "dma_device_id": "system", 00:13:53.919 "dma_device_type": 1 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.919 "dma_device_type": 2 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "dma_device_id": "system", 00:13:53.919 "dma_device_type": 1 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.919 "dma_device_type": 2 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "dma_device_id": "system", 00:13:53.919 "dma_device_type": 1 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.919 "dma_device_type": 2 00:13:53.919 } 00:13:53.919 ], 00:13:53.919 "driver_specific": { 00:13:53.919 "raid": { 00:13:53.919 "uuid": "11e26cf2-6efc-4ba5-8408-19bca68d803c", 00:13:53.919 "strip_size_kb": 0, 00:13:53.919 "state": "online", 00:13:53.919 "raid_level": "raid1", 00:13:53.919 "superblock": false, 00:13:53.919 "num_base_bdevs": 3, 00:13:53.919 "num_base_bdevs_discovered": 3, 00:13:53.919 "num_base_bdevs_operational": 3, 00:13:53.919 "base_bdevs_list": [ 00:13:53.919 { 00:13:53.919 "name": "NewBaseBdev", 00:13:53.919 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:53.919 "is_configured": true, 00:13:53.919 "data_offset": 0, 00:13:53.919 "data_size": 65536 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "name": "BaseBdev2", 00:13:53.919 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:53.919 "is_configured": true, 00:13:53.919 "data_offset": 0, 00:13:53.919 "data_size": 65536 00:13:53.919 }, 00:13:53.919 { 00:13:53.919 "name": "BaseBdev3", 00:13:53.919 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:53.919 "is_configured": true, 00:13:53.919 "data_offset": 0, 00:13:53.919 "data_size": 
65536 00:13:53.919 } 00:13:53.919 ] 00:13:53.919 } 00:13:53.919 } 00:13:53.919 }' 00:13:53.919 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:53.919 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:53.919 BaseBdev2 00:13:53.919 BaseBdev3' 00:13:53.919 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.919 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:53.919 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.180 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.180 "name": "NewBaseBdev", 00:13:54.180 "aliases": [ 00:13:54.180 "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1" 00:13:54.180 ], 00:13:54.180 "product_name": "Malloc disk", 00:13:54.180 "block_size": 512, 00:13:54.180 "num_blocks": 65536, 00:13:54.180 "uuid": "7d3cf6cd-f5a4-4b59-adfb-a023b98cb9b1", 00:13:54.180 "assigned_rate_limits": { 00:13:54.180 "rw_ios_per_sec": 0, 00:13:54.180 "rw_mbytes_per_sec": 0, 00:13:54.180 "r_mbytes_per_sec": 0, 00:13:54.180 "w_mbytes_per_sec": 0 00:13:54.180 }, 00:13:54.180 "claimed": true, 00:13:54.180 "claim_type": "exclusive_write", 00:13:54.180 "zoned": false, 00:13:54.180 "supported_io_types": { 00:13:54.180 "read": true, 00:13:54.180 "write": true, 00:13:54.180 "unmap": true, 00:13:54.180 "flush": true, 00:13:54.180 "reset": true, 00:13:54.180 "nvme_admin": false, 00:13:54.180 "nvme_io": false, 00:13:54.180 "nvme_io_md": false, 00:13:54.180 "write_zeroes": true, 00:13:54.180 "zcopy": true, 00:13:54.180 "get_zone_info": false, 00:13:54.180 "zone_management": false, 00:13:54.180 "zone_append": false, 00:13:54.180 "compare": false, 00:13:54.180 "compare_and_write": false, 00:13:54.180 "abort": true, 00:13:54.180 "seek_hole": false, 00:13:54.180 "seek_data": false, 00:13:54.180 "copy": true, 00:13:54.180 "nvme_iov_md": false 00:13:54.180 }, 00:13:54.180 "memory_domains": [ 00:13:54.180 { 00:13:54.180 "dma_device_id": "system", 00:13:54.181 "dma_device_type": 1 00:13:54.181 }, 00:13:54.181 { 00:13:54.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.181 "dma_device_type": 2 00:13:54.181 } 00:13:54.181 ], 00:13:54.181 "driver_specific": {} 00:13:54.181 }' 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.181 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:54.439 13:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.697 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.697 "name": "BaseBdev2", 00:13:54.697 "aliases": [ 00:13:54.697 "b68b8849-349f-44b2-a628-9d063abbd21e" 00:13:54.697 ], 00:13:54.697 "product_name": "Malloc disk", 00:13:54.697 "block_size": 512, 00:13:54.697 "num_blocks": 65536, 00:13:54.697 "uuid": "b68b8849-349f-44b2-a628-9d063abbd21e", 00:13:54.697 "assigned_rate_limits": { 00:13:54.697 "rw_ios_per_sec": 0, 00:13:54.697 "rw_mbytes_per_sec": 0, 00:13:54.697 "r_mbytes_per_sec": 0, 00:13:54.697 "w_mbytes_per_sec": 0 00:13:54.697 }, 00:13:54.697 "claimed": true, 00:13:54.697 "claim_type": "exclusive_write", 00:13:54.697 "zoned": false, 00:13:54.697 "supported_io_types": { 00:13:54.697 "read": true, 00:13:54.697 "write": true, 00:13:54.697 "unmap": true, 00:13:54.697 "flush": true, 00:13:54.697 "reset": true, 00:13:54.697 "nvme_admin": false, 00:13:54.697 "nvme_io": false, 00:13:54.697 "nvme_io_md": false, 00:13:54.697 "write_zeroes": true, 00:13:54.697 "zcopy": true, 00:13:54.697 "get_zone_info": false, 00:13:54.697 "zone_management": false, 00:13:54.697 "zone_append": false, 00:13:54.697 "compare": false, 00:13:54.697 "compare_and_write": false, 00:13:54.697 "abort": true, 00:13:54.697 "seek_hole": false, 00:13:54.697 "seek_data": false, 00:13:54.697 "copy": true, 00:13:54.697 "nvme_iov_md": false 00:13:54.697 }, 00:13:54.697 "memory_domains": [ 00:13:54.697 { 00:13:54.697 "dma_device_id": "system", 00:13:54.697 "dma_device_type": 1 00:13:54.697 }, 00:13:54.697 { 00:13:54.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.697 "dma_device_type": 2 00:13:54.697 } 00:13:54.697 ], 00:13:54.697 "driver_specific": {} 00:13:54.697 }' 00:13:54.697 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.697 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.697 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.698 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.698 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.698 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.698 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.698 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.956 13:36:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.956 "name": "BaseBdev3", 00:13:54.956 "aliases": [ 00:13:54.956 "ac00f476-5490-4711-bb44-1c665137b475" 00:13:54.956 ], 00:13:54.956 "product_name": "Malloc disk", 00:13:54.956 "block_size": 512, 00:13:54.956 "num_blocks": 65536, 00:13:54.956 "uuid": "ac00f476-5490-4711-bb44-1c665137b475", 00:13:54.956 "assigned_rate_limits": { 00:13:54.956 "rw_ios_per_sec": 0, 00:13:54.956 "rw_mbytes_per_sec": 0, 00:13:54.956 "r_mbytes_per_sec": 0, 00:13:54.956 "w_mbytes_per_sec": 0 00:13:54.956 }, 00:13:54.956 "claimed": true, 00:13:54.956 "claim_type": "exclusive_write", 00:13:54.956 "zoned": false, 00:13:54.956 "supported_io_types": { 00:13:54.956 "read": true, 00:13:54.956 "write": true, 00:13:54.956 "unmap": true, 00:13:54.956 "flush": true, 00:13:54.956 "reset": true, 00:13:54.956 "nvme_admin": false, 00:13:54.956 "nvme_io": false, 00:13:54.956 "nvme_io_md": false, 00:13:54.956 "write_zeroes": true, 00:13:54.956 "zcopy": true, 00:13:54.956 "get_zone_info": false, 00:13:54.956 "zone_management": false, 00:13:54.956 "zone_append": false, 00:13:54.956 "compare": false, 00:13:54.956 "compare_and_write": false, 00:13:54.956 "abort": true, 00:13:54.956 "seek_hole": false, 00:13:54.956 "seek_data": false, 00:13:54.956 "copy": true, 00:13:54.956 "nvme_iov_md": false 00:13:54.956 }, 00:13:54.956 "memory_domains": [ 00:13:54.956 { 00:13:54.956 "dma_device_id": "system", 00:13:54.956 "dma_device_type": 1 00:13:54.956 }, 00:13:54.956 { 00:13:54.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.956 "dma_device_type": 2 00:13:54.956 } 00:13:54.956 ], 00:13:54.956 "driver_specific": {} 00:13:54.956 }' 00:13:54.956 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:55.214 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.509 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.509 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:55.509 13:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:55.509 [2024-07-15 13:36:43.031622] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:55.509 [2024-07-15 13:36:43.031646] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:55.509 [2024-07-15 13:36:43.031688] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:55.509 [2024-07-15 13:36:43.031882] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:55.509 [2024-07-15 13:36:43.031891] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x210eeb0 name Existed_Raid, state offline 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 10011 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 10011 ']' 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 10011 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 10011 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 10011' 00:13:55.509 killing process with pid 10011 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 10011 00:13:55.509 [2024-07-15 13:36:43.088241] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:55.509 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 10011 00:13:55.795 [2024-07-15 13:36:43.113320] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:55.795 13:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:55.795 00:13:55.795 real 0m21.820s 00:13:55.795 user 0m39.705s 00:13:55.795 sys 0m4.296s 00:13:55.795 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:55.795 13:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.795 ************************************ 00:13:55.795 END TEST raid_state_function_test 00:13:55.795 ************************************ 00:13:55.795 13:36:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:55.795 13:36:43 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:13:55.795 13:36:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:55.795 13:36:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:55.795 13:36:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:55.795 ************************************ 00:13:55.795 START TEST raid_state_function_test_sb 00:13:55.795 ************************************ 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=13491 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 13491' 00:13:55.796 Process raid pid: 13491 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 13491 /var/tmp/spdk-raid.sock 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 
-- # '[' -z 13491 ']' 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:55.796 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:55.796 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:56.058 [2024-07-15 13:36:43.452041] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:13:56.058 [2024-07-15 13:36:43.452098] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:56.058 [2024-07-15 13:36:43.538755] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:56.058 [2024-07-15 13:36:43.626862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:56.318 [2024-07-15 13:36:43.685808] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:56.318 [2024-07-15 13:36:43.685836] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:56.887 [2024-07-15 13:36:44.419759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:56.887 [2024-07-15 13:36:44.419797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:56.887 [2024-07-15 13:36:44.419804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:56.887 [2024-07-15 13:36:44.419812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:56.887 [2024-07-15 13:36:44.419818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:56.887 [2024-07-15 13:36:44.419825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:56.887 13:36:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.887 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.146 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.146 "name": "Existed_Raid", 00:13:57.146 "uuid": "57223475-366f-4827-a254-ca37f3cef566", 00:13:57.146 "strip_size_kb": 0, 00:13:57.146 "state": "configuring", 00:13:57.146 "raid_level": "raid1", 00:13:57.146 "superblock": true, 00:13:57.146 "num_base_bdevs": 3, 00:13:57.146 "num_base_bdevs_discovered": 0, 00:13:57.146 "num_base_bdevs_operational": 3, 00:13:57.146 "base_bdevs_list": [ 00:13:57.146 { 00:13:57.146 "name": "BaseBdev1", 00:13:57.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.146 "is_configured": false, 00:13:57.146 "data_offset": 0, 00:13:57.146 "data_size": 0 00:13:57.146 }, 00:13:57.146 { 00:13:57.146 "name": "BaseBdev2", 00:13:57.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.146 "is_configured": false, 00:13:57.146 "data_offset": 0, 00:13:57.146 "data_size": 0 00:13:57.146 }, 00:13:57.146 { 00:13:57.146 "name": "BaseBdev3", 00:13:57.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.146 "is_configured": false, 00:13:57.146 "data_offset": 0, 00:13:57.146 "data_size": 0 00:13:57.146 } 00:13:57.146 ] 00:13:57.146 }' 00:13:57.146 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.146 13:36:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.713 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:57.713 [2024-07-15 13:36:45.241768] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:57.713 [2024-07-15 13:36:45.241794] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21c9f50 name Existed_Raid, state configuring 00:13:57.713 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:57.972 [2024-07-15 13:36:45.418246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:57.972 [2024-07-15 13:36:45.418266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:57.972 [2024-07-15 13:36:45.418272] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:57.972 [2024-07-15 13:36:45.418279] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't 
exist now 00:13:57.972 [2024-07-15 13:36:45.418285] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:57.972 [2024-07-15 13:36:45.418292] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:57.972 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:58.230 [2024-07-15 13:36:45.603529] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:58.230 BaseBdev1 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.231 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:58.489 [ 00:13:58.489 { 00:13:58.489 "name": "BaseBdev1", 00:13:58.489 "aliases": [ 00:13:58.489 "c111345e-1013-47ce-8558-fdf02d1b6f5e" 00:13:58.489 ], 00:13:58.489 "product_name": "Malloc disk", 00:13:58.489 "block_size": 512, 00:13:58.489 "num_blocks": 65536, 00:13:58.489 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:13:58.489 "assigned_rate_limits": { 00:13:58.489 "rw_ios_per_sec": 0, 00:13:58.489 "rw_mbytes_per_sec": 0, 00:13:58.489 "r_mbytes_per_sec": 0, 00:13:58.489 "w_mbytes_per_sec": 0 00:13:58.489 }, 00:13:58.489 "claimed": true, 00:13:58.489 "claim_type": "exclusive_write", 00:13:58.489 "zoned": false, 00:13:58.489 "supported_io_types": { 00:13:58.489 "read": true, 00:13:58.489 "write": true, 00:13:58.489 "unmap": true, 00:13:58.489 "flush": true, 00:13:58.489 "reset": true, 00:13:58.489 "nvme_admin": false, 00:13:58.489 "nvme_io": false, 00:13:58.489 "nvme_io_md": false, 00:13:58.489 "write_zeroes": true, 00:13:58.489 "zcopy": true, 00:13:58.489 "get_zone_info": false, 00:13:58.489 "zone_management": false, 00:13:58.489 "zone_append": false, 00:13:58.489 "compare": false, 00:13:58.489 "compare_and_write": false, 00:13:58.489 "abort": true, 00:13:58.489 "seek_hole": false, 00:13:58.489 "seek_data": false, 00:13:58.489 "copy": true, 00:13:58.489 "nvme_iov_md": false 00:13:58.489 }, 00:13:58.489 "memory_domains": [ 00:13:58.489 { 00:13:58.489 "dma_device_id": "system", 00:13:58.489 "dma_device_type": 1 00:13:58.489 }, 00:13:58.489 { 00:13:58.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.489 "dma_device_type": 2 00:13:58.489 } 00:13:58.489 ], 00:13:58.489 "driver_specific": {} 00:13:58.489 } 00:13:58.489 ] 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:58.489 13:36:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.489 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.748 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.748 "name": "Existed_Raid", 00:13:58.748 "uuid": "854aa6e9-4a54-46f2-9ebb-ef01a99bd59c", 00:13:58.748 "strip_size_kb": 0, 00:13:58.748 "state": "configuring", 00:13:58.748 "raid_level": "raid1", 00:13:58.748 "superblock": true, 00:13:58.748 "num_base_bdevs": 3, 00:13:58.748 "num_base_bdevs_discovered": 1, 00:13:58.748 "num_base_bdevs_operational": 3, 00:13:58.748 "base_bdevs_list": [ 00:13:58.748 { 00:13:58.748 "name": "BaseBdev1", 00:13:58.748 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:13:58.748 "is_configured": true, 00:13:58.748 "data_offset": 2048, 00:13:58.748 "data_size": 63488 00:13:58.748 }, 00:13:58.748 { 00:13:58.748 "name": "BaseBdev2", 00:13:58.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.748 "is_configured": false, 00:13:58.748 "data_offset": 0, 00:13:58.748 "data_size": 0 00:13:58.748 }, 00:13:58.748 { 00:13:58.748 "name": "BaseBdev3", 00:13:58.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.748 "is_configured": false, 00:13:58.748 "data_offset": 0, 00:13:58.748 "data_size": 0 00:13:58.748 } 00:13:58.748 ] 00:13:58.748 }' 00:13:58.748 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.748 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.316 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:59.316 [2024-07-15 13:36:46.786570] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:59.316 [2024-07-15 13:36:46.786603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21c9820 name Existed_Raid, state configuring 00:13:59.316 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:59.574 [2024-07-15 13:36:46.971083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:59.574 [2024-07-15 13:36:46.972175] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:59.575 [2024-07-15 13:36:46.972200] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:59.575 [2024-07-15 13:36:46.972207] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:59.575 [2024-07-15 13:36:46.972214] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.575 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.575 13:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.575 "name": "Existed_Raid", 00:13:59.575 "uuid": "8c08deec-e1d3-4e96-b176-f0db2b250055", 00:13:59.575 "strip_size_kb": 0, 00:13:59.575 "state": "configuring", 00:13:59.575 "raid_level": "raid1", 00:13:59.575 "superblock": true, 00:13:59.575 "num_base_bdevs": 3, 00:13:59.575 "num_base_bdevs_discovered": 1, 00:13:59.575 "num_base_bdevs_operational": 3, 00:13:59.575 "base_bdevs_list": [ 00:13:59.575 { 00:13:59.575 "name": "BaseBdev1", 00:13:59.575 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:13:59.575 "is_configured": true, 00:13:59.575 "data_offset": 2048, 00:13:59.575 "data_size": 63488 00:13:59.575 }, 00:13:59.575 { 00:13:59.575 "name": "BaseBdev2", 00:13:59.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.575 "is_configured": false, 00:13:59.575 "data_offset": 0, 00:13:59.575 "data_size": 0 00:13:59.575 }, 00:13:59.575 { 00:13:59.575 "name": "BaseBdev3", 00:13:59.575 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:59.575 "is_configured": false, 00:13:59.575 "data_offset": 0, 00:13:59.575 "data_size": 0 00:13:59.575 } 00:13:59.575 ] 00:13:59.575 }' 00:13:59.575 13:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.575 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:00.142 13:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:00.400 [2024-07-15 13:36:47.856707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:00.400 BaseBdev2 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:00.400 13:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:00.658 [ 00:14:00.658 { 00:14:00.658 "name": "BaseBdev2", 00:14:00.658 "aliases": [ 00:14:00.658 "bf31d1b1-c466-433f-bd1e-950d7af583b1" 00:14:00.658 ], 00:14:00.658 "product_name": "Malloc disk", 00:14:00.658 "block_size": 512, 00:14:00.658 "num_blocks": 65536, 00:14:00.658 "uuid": "bf31d1b1-c466-433f-bd1e-950d7af583b1", 00:14:00.658 "assigned_rate_limits": { 00:14:00.658 "rw_ios_per_sec": 0, 00:14:00.658 "rw_mbytes_per_sec": 0, 00:14:00.658 "r_mbytes_per_sec": 0, 00:14:00.658 "w_mbytes_per_sec": 0 00:14:00.658 }, 00:14:00.658 "claimed": true, 00:14:00.658 "claim_type": "exclusive_write", 00:14:00.658 "zoned": false, 00:14:00.658 "supported_io_types": { 00:14:00.658 "read": true, 00:14:00.658 "write": true, 00:14:00.658 "unmap": true, 00:14:00.658 "flush": true, 00:14:00.658 "reset": true, 00:14:00.658 "nvme_admin": false, 00:14:00.658 "nvme_io": false, 00:14:00.658 "nvme_io_md": false, 00:14:00.658 "write_zeroes": true, 00:14:00.658 "zcopy": true, 00:14:00.658 "get_zone_info": false, 00:14:00.658 "zone_management": false, 00:14:00.658 "zone_append": false, 00:14:00.658 "compare": false, 00:14:00.658 "compare_and_write": false, 00:14:00.658 "abort": true, 00:14:00.658 "seek_hole": false, 00:14:00.658 "seek_data": false, 00:14:00.658 "copy": true, 00:14:00.658 "nvme_iov_md": false 00:14:00.658 }, 00:14:00.658 "memory_domains": [ 00:14:00.658 { 00:14:00.658 "dma_device_id": "system", 00:14:00.658 "dma_device_type": 1 00:14:00.658 }, 00:14:00.658 { 00:14:00.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.658 "dma_device_type": 2 00:14:00.658 } 00:14:00.658 ], 00:14:00.658 "driver_specific": {} 00:14:00.658 } 00:14:00.658 ] 00:14:00.658 
13:36:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.658 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.917 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.917 "name": "Existed_Raid", 00:14:00.917 "uuid": "8c08deec-e1d3-4e96-b176-f0db2b250055", 00:14:00.917 "strip_size_kb": 0, 00:14:00.917 "state": "configuring", 00:14:00.917 "raid_level": "raid1", 00:14:00.917 "superblock": true, 00:14:00.917 "num_base_bdevs": 3, 00:14:00.917 "num_base_bdevs_discovered": 2, 00:14:00.917 "num_base_bdevs_operational": 3, 00:14:00.917 "base_bdevs_list": [ 00:14:00.917 { 00:14:00.917 "name": "BaseBdev1", 00:14:00.917 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:14:00.917 "is_configured": true, 00:14:00.917 "data_offset": 2048, 00:14:00.917 "data_size": 63488 00:14:00.917 }, 00:14:00.917 { 00:14:00.917 "name": "BaseBdev2", 00:14:00.917 "uuid": "bf31d1b1-c466-433f-bd1e-950d7af583b1", 00:14:00.917 "is_configured": true, 00:14:00.917 "data_offset": 2048, 00:14:00.917 "data_size": 63488 00:14:00.917 }, 00:14:00.917 { 00:14:00.917 "name": "BaseBdev3", 00:14:00.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.917 "is_configured": false, 00:14:00.917 "data_offset": 0, 00:14:00.917 "data_size": 0 00:14:00.917 } 00:14:00.917 ] 00:14:00.917 }' 00:14:00.917 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.917 13:36:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.483 13:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:01.483 [2024-07-15 13:36:49.030608] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:01.483 [2024-07-15 13:36:49.030745] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21ca710 00:14:01.484 [2024-07-15 13:36:49.030754] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:01.484 [2024-07-15 13:36:49.030879] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ca3e0 00:14:01.484 [2024-07-15 13:36:49.030971] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21ca710 00:14:01.484 [2024-07-15 13:36:49.030977] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21ca710 00:14:01.484 [2024-07-15 13:36:49.031070] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:01.484 BaseBdev3 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:01.484 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:01.742 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:02.000 [ 00:14:02.000 { 00:14:02.000 "name": "BaseBdev3", 00:14:02.000 "aliases": [ 00:14:02.000 "268b2fbf-3293-4751-990e-6cd356c57772" 00:14:02.000 ], 00:14:02.000 "product_name": "Malloc disk", 00:14:02.000 "block_size": 512, 00:14:02.000 "num_blocks": 65536, 00:14:02.000 "uuid": "268b2fbf-3293-4751-990e-6cd356c57772", 00:14:02.000 "assigned_rate_limits": { 00:14:02.000 "rw_ios_per_sec": 0, 00:14:02.000 "rw_mbytes_per_sec": 0, 00:14:02.000 "r_mbytes_per_sec": 0, 00:14:02.000 "w_mbytes_per_sec": 0 00:14:02.000 }, 00:14:02.000 "claimed": true, 00:14:02.000 "claim_type": "exclusive_write", 00:14:02.000 "zoned": false, 00:14:02.000 "supported_io_types": { 00:14:02.000 "read": true, 00:14:02.000 "write": true, 00:14:02.000 "unmap": true, 00:14:02.000 "flush": true, 00:14:02.000 "reset": true, 00:14:02.000 "nvme_admin": false, 00:14:02.000 "nvme_io": false, 00:14:02.000 "nvme_io_md": false, 00:14:02.000 "write_zeroes": true, 00:14:02.000 "zcopy": true, 00:14:02.000 "get_zone_info": false, 00:14:02.000 "zone_management": false, 00:14:02.000 "zone_append": false, 00:14:02.000 "compare": false, 00:14:02.000 "compare_and_write": false, 00:14:02.000 "abort": true, 00:14:02.000 "seek_hole": false, 00:14:02.000 "seek_data": false, 00:14:02.000 "copy": true, 00:14:02.000 "nvme_iov_md": false 00:14:02.000 }, 00:14:02.000 "memory_domains": [ 00:14:02.000 { 00:14:02.000 "dma_device_id": "system", 00:14:02.000 "dma_device_type": 1 00:14:02.000 }, 00:14:02.000 { 00:14:02.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.000 "dma_device_type": 2 
00:14:02.000 } 00:14:02.000 ], 00:14:02.000 "driver_specific": {} 00:14:02.000 } 00:14:02.000 ] 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.000 "name": "Existed_Raid", 00:14:02.000 "uuid": "8c08deec-e1d3-4e96-b176-f0db2b250055", 00:14:02.000 "strip_size_kb": 0, 00:14:02.000 "state": "online", 00:14:02.000 "raid_level": "raid1", 00:14:02.000 "superblock": true, 00:14:02.000 "num_base_bdevs": 3, 00:14:02.000 "num_base_bdevs_discovered": 3, 00:14:02.000 "num_base_bdevs_operational": 3, 00:14:02.000 "base_bdevs_list": [ 00:14:02.000 { 00:14:02.000 "name": "BaseBdev1", 00:14:02.000 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:14:02.000 "is_configured": true, 00:14:02.000 "data_offset": 2048, 00:14:02.000 "data_size": 63488 00:14:02.000 }, 00:14:02.000 { 00:14:02.000 "name": "BaseBdev2", 00:14:02.000 "uuid": "bf31d1b1-c466-433f-bd1e-950d7af583b1", 00:14:02.000 "is_configured": true, 00:14:02.000 "data_offset": 2048, 00:14:02.000 "data_size": 63488 00:14:02.000 }, 00:14:02.000 { 00:14:02.000 "name": "BaseBdev3", 00:14:02.000 "uuid": "268b2fbf-3293-4751-990e-6cd356c57772", 00:14:02.000 "is_configured": true, 00:14:02.000 "data_offset": 2048, 00:14:02.000 "data_size": 63488 00:14:02.000 } 00:14:02.000 ] 00:14:02.000 }' 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.000 13:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:02.566 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:02.824 [2024-07-15 13:36:50.229929] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:02.824 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:02.824 "name": "Existed_Raid", 00:14:02.824 "aliases": [ 00:14:02.824 "8c08deec-e1d3-4e96-b176-f0db2b250055" 00:14:02.824 ], 00:14:02.824 "product_name": "Raid Volume", 00:14:02.824 "block_size": 512, 00:14:02.824 "num_blocks": 63488, 00:14:02.824 "uuid": "8c08deec-e1d3-4e96-b176-f0db2b250055", 00:14:02.824 "assigned_rate_limits": { 00:14:02.824 "rw_ios_per_sec": 0, 00:14:02.824 "rw_mbytes_per_sec": 0, 00:14:02.824 "r_mbytes_per_sec": 0, 00:14:02.824 "w_mbytes_per_sec": 0 00:14:02.824 }, 00:14:02.824 "claimed": false, 00:14:02.824 "zoned": false, 00:14:02.824 "supported_io_types": { 00:14:02.824 "read": true, 00:14:02.824 "write": true, 00:14:02.824 "unmap": false, 00:14:02.824 "flush": false, 00:14:02.824 "reset": true, 00:14:02.824 "nvme_admin": false, 00:14:02.824 "nvme_io": false, 00:14:02.824 "nvme_io_md": false, 00:14:02.824 "write_zeroes": true, 00:14:02.824 "zcopy": false, 00:14:02.824 "get_zone_info": false, 00:14:02.824 "zone_management": false, 00:14:02.824 "zone_append": false, 00:14:02.824 "compare": false, 00:14:02.824 "compare_and_write": false, 00:14:02.824 "abort": false, 00:14:02.824 "seek_hole": false, 00:14:02.824 "seek_data": false, 00:14:02.824 "copy": false, 00:14:02.824 "nvme_iov_md": false 00:14:02.824 }, 00:14:02.824 "memory_domains": [ 00:14:02.824 { 00:14:02.824 "dma_device_id": "system", 00:14:02.824 "dma_device_type": 1 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.824 "dma_device_type": 2 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "dma_device_id": "system", 00:14:02.824 "dma_device_type": 1 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.824 "dma_device_type": 2 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "dma_device_id": "system", 00:14:02.824 "dma_device_type": 1 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.824 "dma_device_type": 2 00:14:02.824 } 00:14:02.824 ], 00:14:02.824 "driver_specific": { 00:14:02.824 "raid": { 00:14:02.824 "uuid": "8c08deec-e1d3-4e96-b176-f0db2b250055", 00:14:02.824 "strip_size_kb": 0, 00:14:02.824 "state": "online", 00:14:02.824 "raid_level": "raid1", 00:14:02.824 "superblock": true, 00:14:02.824 "num_base_bdevs": 3, 00:14:02.824 "num_base_bdevs_discovered": 3, 00:14:02.824 "num_base_bdevs_operational": 3, 00:14:02.824 "base_bdevs_list": [ 00:14:02.824 { 00:14:02.824 "name": "BaseBdev1", 00:14:02.824 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:14:02.824 
"is_configured": true, 00:14:02.824 "data_offset": 2048, 00:14:02.824 "data_size": 63488 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "name": "BaseBdev2", 00:14:02.824 "uuid": "bf31d1b1-c466-433f-bd1e-950d7af583b1", 00:14:02.824 "is_configured": true, 00:14:02.824 "data_offset": 2048, 00:14:02.824 "data_size": 63488 00:14:02.824 }, 00:14:02.824 { 00:14:02.824 "name": "BaseBdev3", 00:14:02.824 "uuid": "268b2fbf-3293-4751-990e-6cd356c57772", 00:14:02.824 "is_configured": true, 00:14:02.824 "data_offset": 2048, 00:14:02.824 "data_size": 63488 00:14:02.824 } 00:14:02.824 ] 00:14:02.824 } 00:14:02.824 } 00:14:02.824 }' 00:14:02.824 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:02.824 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:02.824 BaseBdev2 00:14:02.824 BaseBdev3' 00:14:02.824 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:02.824 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:02.824 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.081 "name": "BaseBdev1", 00:14:03.081 "aliases": [ 00:14:03.081 "c111345e-1013-47ce-8558-fdf02d1b6f5e" 00:14:03.081 ], 00:14:03.081 "product_name": "Malloc disk", 00:14:03.081 "block_size": 512, 00:14:03.081 "num_blocks": 65536, 00:14:03.081 "uuid": "c111345e-1013-47ce-8558-fdf02d1b6f5e", 00:14:03.081 "assigned_rate_limits": { 00:14:03.081 "rw_ios_per_sec": 0, 00:14:03.081 "rw_mbytes_per_sec": 0, 00:14:03.081 "r_mbytes_per_sec": 0, 00:14:03.081 "w_mbytes_per_sec": 0 00:14:03.081 }, 00:14:03.081 "claimed": true, 00:14:03.081 "claim_type": "exclusive_write", 00:14:03.081 "zoned": false, 00:14:03.081 "supported_io_types": { 00:14:03.081 "read": true, 00:14:03.081 "write": true, 00:14:03.081 "unmap": true, 00:14:03.081 "flush": true, 00:14:03.081 "reset": true, 00:14:03.081 "nvme_admin": false, 00:14:03.081 "nvme_io": false, 00:14:03.081 "nvme_io_md": false, 00:14:03.081 "write_zeroes": true, 00:14:03.081 "zcopy": true, 00:14:03.081 "get_zone_info": false, 00:14:03.081 "zone_management": false, 00:14:03.081 "zone_append": false, 00:14:03.081 "compare": false, 00:14:03.081 "compare_and_write": false, 00:14:03.081 "abort": true, 00:14:03.081 "seek_hole": false, 00:14:03.081 "seek_data": false, 00:14:03.081 "copy": true, 00:14:03.081 "nvme_iov_md": false 00:14:03.081 }, 00:14:03.081 "memory_domains": [ 00:14:03.081 { 00:14:03.081 "dma_device_id": "system", 00:14:03.081 "dma_device_type": 1 00:14:03.081 }, 00:14:03.081 { 00:14:03.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.081 "dma_device_type": 2 00:14:03.081 } 00:14:03.081 ], 00:14:03.081 "driver_specific": {} 00:14:03.081 }' 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.081 13:36:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.081 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.338 "name": "BaseBdev2", 00:14:03.338 "aliases": [ 00:14:03.338 "bf31d1b1-c466-433f-bd1e-950d7af583b1" 00:14:03.338 ], 00:14:03.338 "product_name": "Malloc disk", 00:14:03.338 "block_size": 512, 00:14:03.338 "num_blocks": 65536, 00:14:03.338 "uuid": "bf31d1b1-c466-433f-bd1e-950d7af583b1", 00:14:03.338 "assigned_rate_limits": { 00:14:03.338 "rw_ios_per_sec": 0, 00:14:03.338 "rw_mbytes_per_sec": 0, 00:14:03.338 "r_mbytes_per_sec": 0, 00:14:03.338 "w_mbytes_per_sec": 0 00:14:03.338 }, 00:14:03.338 "claimed": true, 00:14:03.338 "claim_type": "exclusive_write", 00:14:03.338 "zoned": false, 00:14:03.338 "supported_io_types": { 00:14:03.338 "read": true, 00:14:03.338 "write": true, 00:14:03.338 "unmap": true, 00:14:03.338 "flush": true, 00:14:03.338 "reset": true, 00:14:03.338 "nvme_admin": false, 00:14:03.338 "nvme_io": false, 00:14:03.338 "nvme_io_md": false, 00:14:03.338 "write_zeroes": true, 00:14:03.338 "zcopy": true, 00:14:03.338 "get_zone_info": false, 00:14:03.338 "zone_management": false, 00:14:03.338 "zone_append": false, 00:14:03.338 "compare": false, 00:14:03.338 "compare_and_write": false, 00:14:03.338 "abort": true, 00:14:03.338 "seek_hole": false, 00:14:03.338 "seek_data": false, 00:14:03.338 "copy": true, 00:14:03.338 "nvme_iov_md": false 00:14:03.338 }, 00:14:03.338 "memory_domains": [ 00:14:03.338 { 00:14:03.338 "dma_device_id": "system", 00:14:03.338 "dma_device_type": 1 00:14:03.338 }, 00:14:03.338 { 00:14:03.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.338 "dma_device_type": 2 00:14:03.338 } 00:14:03.338 ], 00:14:03.338 "driver_specific": {} 00:14:03.338 }' 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.338 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.596 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.596 13:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.596 13:36:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:03.596 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.853 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.853 "name": "BaseBdev3", 00:14:03.853 "aliases": [ 00:14:03.853 "268b2fbf-3293-4751-990e-6cd356c57772" 00:14:03.853 ], 00:14:03.853 "product_name": "Malloc disk", 00:14:03.853 "block_size": 512, 00:14:03.853 "num_blocks": 65536, 00:14:03.853 "uuid": "268b2fbf-3293-4751-990e-6cd356c57772", 00:14:03.853 "assigned_rate_limits": { 00:14:03.853 "rw_ios_per_sec": 0, 00:14:03.853 "rw_mbytes_per_sec": 0, 00:14:03.853 "r_mbytes_per_sec": 0, 00:14:03.853 "w_mbytes_per_sec": 0 00:14:03.853 }, 00:14:03.853 "claimed": true, 00:14:03.853 "claim_type": "exclusive_write", 00:14:03.853 "zoned": false, 00:14:03.853 "supported_io_types": { 00:14:03.853 "read": true, 00:14:03.853 "write": true, 00:14:03.853 "unmap": true, 00:14:03.853 "flush": true, 00:14:03.853 "reset": true, 00:14:03.853 "nvme_admin": false, 00:14:03.853 "nvme_io": false, 00:14:03.853 "nvme_io_md": false, 00:14:03.853 "write_zeroes": true, 00:14:03.853 "zcopy": true, 00:14:03.853 "get_zone_info": false, 00:14:03.853 "zone_management": false, 00:14:03.853 "zone_append": false, 00:14:03.853 "compare": false, 00:14:03.853 "compare_and_write": false, 00:14:03.853 "abort": true, 00:14:03.853 "seek_hole": false, 00:14:03.853 "seek_data": false, 00:14:03.853 "copy": true, 00:14:03.853 "nvme_iov_md": false 00:14:03.853 }, 00:14:03.853 "memory_domains": [ 00:14:03.853 { 00:14:03.854 "dma_device_id": "system", 00:14:03.854 "dma_device_type": 1 00:14:03.854 }, 00:14:03.854 { 00:14:03.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.854 "dma_device_type": 2 00:14:03.854 } 00:14:03.854 ], 00:14:03.854 "driver_specific": {} 00:14:03.854 }' 00:14:03.854 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.854 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.854 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.854 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.854 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.126 13:36:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.126 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:04.385 [2024-07-15 13:36:51.801821] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.385 "name": "Existed_Raid", 00:14:04.385 "uuid": "8c08deec-e1d3-4e96-b176-f0db2b250055", 00:14:04.385 "strip_size_kb": 0, 00:14:04.385 "state": "online", 00:14:04.385 "raid_level": "raid1", 00:14:04.385 "superblock": true, 00:14:04.385 "num_base_bdevs": 3, 00:14:04.385 "num_base_bdevs_discovered": 2, 00:14:04.385 "num_base_bdevs_operational": 2, 00:14:04.385 "base_bdevs_list": [ 00:14:04.385 { 00:14:04.385 "name": null, 00:14:04.385 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:04.385 "is_configured": false, 00:14:04.385 "data_offset": 2048, 00:14:04.385 "data_size": 63488 00:14:04.385 }, 00:14:04.385 { 00:14:04.385 "name": "BaseBdev2", 00:14:04.385 "uuid": "bf31d1b1-c466-433f-bd1e-950d7af583b1", 00:14:04.385 "is_configured": true, 00:14:04.385 "data_offset": 2048, 00:14:04.385 "data_size": 63488 00:14:04.385 }, 00:14:04.385 { 00:14:04.385 "name": "BaseBdev3", 00:14:04.385 "uuid": "268b2fbf-3293-4751-990e-6cd356c57772", 00:14:04.385 "is_configured": true, 00:14:04.385 "data_offset": 2048, 00:14:04.385 "data_size": 63488 00:14:04.385 } 00:14:04.385 ] 00:14:04.385 }' 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.385 13:36:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:04.949 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:04.949 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:04.949 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:04.949 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:05.206 [2024-07-15 13:36:52.789794] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.206 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:05.465 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:05.465 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:05.465 13:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:05.725 [2024-07-15 13:36:53.146503] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:05.725 [2024-07-15 13:36:53.146572] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:05.725 [2024-07-15 13:36:53.158245] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:05.725 [2024-07-15 13:36:53.158290] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:05.725 [2024-07-15 13:36:53.158299] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ca710 
name Existed_Raid, state offline 00:14:05.725 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:05.725 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:05.725 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.725 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:05.983 BaseBdev2 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:05.983 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:06.239 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:06.498 [ 00:14:06.498 { 00:14:06.498 "name": "BaseBdev2", 00:14:06.498 "aliases": [ 00:14:06.498 "f1418626-c9b7-4cc8-b169-0118d731a274" 00:14:06.498 ], 00:14:06.498 "product_name": "Malloc disk", 00:14:06.498 "block_size": 512, 00:14:06.498 "num_blocks": 65536, 00:14:06.498 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:06.498 "assigned_rate_limits": { 00:14:06.498 "rw_ios_per_sec": 0, 00:14:06.498 "rw_mbytes_per_sec": 0, 00:14:06.498 "r_mbytes_per_sec": 0, 00:14:06.498 "w_mbytes_per_sec": 0 00:14:06.498 }, 00:14:06.498 "claimed": false, 00:14:06.498 "zoned": false, 00:14:06.498 "supported_io_types": { 00:14:06.498 "read": true, 00:14:06.498 "write": true, 00:14:06.498 "unmap": true, 00:14:06.498 "flush": true, 00:14:06.498 "reset": true, 00:14:06.498 "nvme_admin": false, 00:14:06.498 "nvme_io": false, 00:14:06.498 "nvme_io_md": false, 00:14:06.498 "write_zeroes": true, 00:14:06.498 "zcopy": true, 00:14:06.498 "get_zone_info": false, 00:14:06.498 "zone_management": false, 00:14:06.498 "zone_append": false, 00:14:06.498 "compare": false, 00:14:06.498 
"compare_and_write": false, 00:14:06.498 "abort": true, 00:14:06.498 "seek_hole": false, 00:14:06.498 "seek_data": false, 00:14:06.498 "copy": true, 00:14:06.498 "nvme_iov_md": false 00:14:06.498 }, 00:14:06.498 "memory_domains": [ 00:14:06.498 { 00:14:06.498 "dma_device_id": "system", 00:14:06.498 "dma_device_type": 1 00:14:06.498 }, 00:14:06.498 { 00:14:06.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.498 "dma_device_type": 2 00:14:06.498 } 00:14:06.498 ], 00:14:06.498 "driver_specific": {} 00:14:06.498 } 00:14:06.498 ] 00:14:06.498 13:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:06.498 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:06.498 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:06.498 13:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:06.498 BaseBdev3 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:06.498 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:06.756 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:07.015 [ 00:14:07.015 { 00:14:07.015 "name": "BaseBdev3", 00:14:07.015 "aliases": [ 00:14:07.015 "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f" 00:14:07.015 ], 00:14:07.015 "product_name": "Malloc disk", 00:14:07.015 "block_size": 512, 00:14:07.015 "num_blocks": 65536, 00:14:07.015 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:07.015 "assigned_rate_limits": { 00:14:07.015 "rw_ios_per_sec": 0, 00:14:07.015 "rw_mbytes_per_sec": 0, 00:14:07.015 "r_mbytes_per_sec": 0, 00:14:07.015 "w_mbytes_per_sec": 0 00:14:07.015 }, 00:14:07.015 "claimed": false, 00:14:07.015 "zoned": false, 00:14:07.015 "supported_io_types": { 00:14:07.015 "read": true, 00:14:07.015 "write": true, 00:14:07.015 "unmap": true, 00:14:07.015 "flush": true, 00:14:07.015 "reset": true, 00:14:07.015 "nvme_admin": false, 00:14:07.015 "nvme_io": false, 00:14:07.016 "nvme_io_md": false, 00:14:07.016 "write_zeroes": true, 00:14:07.016 "zcopy": true, 00:14:07.016 "get_zone_info": false, 00:14:07.016 "zone_management": false, 00:14:07.016 "zone_append": false, 00:14:07.016 "compare": false, 00:14:07.016 "compare_and_write": false, 00:14:07.016 "abort": true, 00:14:07.016 "seek_hole": false, 00:14:07.016 "seek_data": false, 00:14:07.016 "copy": true, 00:14:07.016 "nvme_iov_md": false 00:14:07.016 }, 00:14:07.016 "memory_domains": [ 00:14:07.016 { 
00:14:07.016 "dma_device_id": "system", 00:14:07.016 "dma_device_type": 1 00:14:07.016 }, 00:14:07.016 { 00:14:07.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.016 "dma_device_type": 2 00:14:07.016 } 00:14:07.016 ], 00:14:07.016 "driver_specific": {} 00:14:07.016 } 00:14:07.016 ] 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:07.016 [2024-07-15 13:36:54.570796] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:07.016 [2024-07-15 13:36:54.570835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:07.016 [2024-07-15 13:36:54.570850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:07.016 [2024-07-15 13:36:54.571882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.016 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.275 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.275 "name": "Existed_Raid", 00:14:07.275 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:07.275 "strip_size_kb": 0, 00:14:07.275 "state": "configuring", 00:14:07.275 "raid_level": "raid1", 00:14:07.275 "superblock": true, 00:14:07.275 "num_base_bdevs": 3, 00:14:07.275 "num_base_bdevs_discovered": 2, 00:14:07.275 "num_base_bdevs_operational": 3, 00:14:07.275 "base_bdevs_list": [ 00:14:07.275 { 00:14:07.275 "name": "BaseBdev1", 00:14:07.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.275 "is_configured": 
false, 00:14:07.275 "data_offset": 0, 00:14:07.275 "data_size": 0 00:14:07.275 }, 00:14:07.275 { 00:14:07.275 "name": "BaseBdev2", 00:14:07.275 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:07.275 "is_configured": true, 00:14:07.275 "data_offset": 2048, 00:14:07.275 "data_size": 63488 00:14:07.275 }, 00:14:07.275 { 00:14:07.275 "name": "BaseBdev3", 00:14:07.275 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:07.275 "is_configured": true, 00:14:07.275 "data_offset": 2048, 00:14:07.275 "data_size": 63488 00:14:07.275 } 00:14:07.275 ] 00:14:07.275 }' 00:14:07.275 13:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.275 13:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:07.841 [2024-07-15 13:36:55.420952] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.841 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.100 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.100 "name": "Existed_Raid", 00:14:08.100 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:08.100 "strip_size_kb": 0, 00:14:08.100 "state": "configuring", 00:14:08.100 "raid_level": "raid1", 00:14:08.100 "superblock": true, 00:14:08.100 "num_base_bdevs": 3, 00:14:08.100 "num_base_bdevs_discovered": 1, 00:14:08.100 "num_base_bdevs_operational": 3, 00:14:08.100 "base_bdevs_list": [ 00:14:08.100 { 00:14:08.100 "name": "BaseBdev1", 00:14:08.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.100 "is_configured": false, 00:14:08.100 "data_offset": 0, 00:14:08.100 "data_size": 0 00:14:08.100 }, 00:14:08.100 { 00:14:08.100 "name": null, 00:14:08.100 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:08.100 "is_configured": false, 00:14:08.100 "data_offset": 2048, 00:14:08.100 "data_size": 
63488 00:14:08.100 }, 00:14:08.100 { 00:14:08.100 "name": "BaseBdev3", 00:14:08.100 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:08.100 "is_configured": true, 00:14:08.100 "data_offset": 2048, 00:14:08.100 "data_size": 63488 00:14:08.100 } 00:14:08.100 ] 00:14:08.100 }' 00:14:08.100 13:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.100 13:36:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.690 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.690 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:08.690 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:08.690 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:08.948 [2024-07-15 13:36:56.446492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:08.948 BaseBdev1 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:08.948 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:09.207 [ 00:14:09.207 { 00:14:09.207 "name": "BaseBdev1", 00:14:09.207 "aliases": [ 00:14:09.207 "ff5055bb-8270-4ee3-bb7c-313cfa8df760" 00:14:09.207 ], 00:14:09.207 "product_name": "Malloc disk", 00:14:09.207 "block_size": 512, 00:14:09.207 "num_blocks": 65536, 00:14:09.207 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:09.207 "assigned_rate_limits": { 00:14:09.207 "rw_ios_per_sec": 0, 00:14:09.207 "rw_mbytes_per_sec": 0, 00:14:09.207 "r_mbytes_per_sec": 0, 00:14:09.207 "w_mbytes_per_sec": 0 00:14:09.207 }, 00:14:09.207 "claimed": true, 00:14:09.207 "claim_type": "exclusive_write", 00:14:09.207 "zoned": false, 00:14:09.207 "supported_io_types": { 00:14:09.207 "read": true, 00:14:09.207 "write": true, 00:14:09.207 "unmap": true, 00:14:09.207 "flush": true, 00:14:09.207 "reset": true, 00:14:09.207 "nvme_admin": false, 00:14:09.207 "nvme_io": false, 00:14:09.207 "nvme_io_md": false, 00:14:09.207 "write_zeroes": true, 00:14:09.207 "zcopy": true, 00:14:09.207 "get_zone_info": false, 00:14:09.207 "zone_management": false, 00:14:09.207 "zone_append": false, 00:14:09.207 "compare": false, 00:14:09.207 
"compare_and_write": false, 00:14:09.207 "abort": true, 00:14:09.207 "seek_hole": false, 00:14:09.207 "seek_data": false, 00:14:09.207 "copy": true, 00:14:09.207 "nvme_iov_md": false 00:14:09.207 }, 00:14:09.207 "memory_domains": [ 00:14:09.207 { 00:14:09.207 "dma_device_id": "system", 00:14:09.207 "dma_device_type": 1 00:14:09.207 }, 00:14:09.207 { 00:14:09.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.207 "dma_device_type": 2 00:14:09.207 } 00:14:09.207 ], 00:14:09.207 "driver_specific": {} 00:14:09.207 } 00:14:09.207 ] 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.207 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.467 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.467 "name": "Existed_Raid", 00:14:09.467 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:09.467 "strip_size_kb": 0, 00:14:09.467 "state": "configuring", 00:14:09.467 "raid_level": "raid1", 00:14:09.467 "superblock": true, 00:14:09.467 "num_base_bdevs": 3, 00:14:09.467 "num_base_bdevs_discovered": 2, 00:14:09.467 "num_base_bdevs_operational": 3, 00:14:09.467 "base_bdevs_list": [ 00:14:09.467 { 00:14:09.467 "name": "BaseBdev1", 00:14:09.467 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:09.467 "is_configured": true, 00:14:09.467 "data_offset": 2048, 00:14:09.467 "data_size": 63488 00:14:09.467 }, 00:14:09.467 { 00:14:09.467 "name": null, 00:14:09.467 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:09.467 "is_configured": false, 00:14:09.467 "data_offset": 2048, 00:14:09.467 "data_size": 63488 00:14:09.467 }, 00:14:09.467 { 00:14:09.467 "name": "BaseBdev3", 00:14:09.467 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:09.467 "is_configured": true, 00:14:09.467 "data_offset": 2048, 00:14:09.467 "data_size": 63488 00:14:09.467 } 00:14:09.467 ] 00:14:09.467 }' 00:14:09.467 13:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.467 13:36:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:14:10.032 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:10.032 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.032 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:10.032 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:10.329 [2024-07-15 13:36:57.802023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.329 13:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.587 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.587 "name": "Existed_Raid", 00:14:10.587 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:10.587 "strip_size_kb": 0, 00:14:10.587 "state": "configuring", 00:14:10.587 "raid_level": "raid1", 00:14:10.587 "superblock": true, 00:14:10.587 "num_base_bdevs": 3, 00:14:10.587 "num_base_bdevs_discovered": 1, 00:14:10.587 "num_base_bdevs_operational": 3, 00:14:10.587 "base_bdevs_list": [ 00:14:10.587 { 00:14:10.587 "name": "BaseBdev1", 00:14:10.587 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:10.587 "is_configured": true, 00:14:10.587 "data_offset": 2048, 00:14:10.587 "data_size": 63488 00:14:10.587 }, 00:14:10.587 { 00:14:10.587 "name": null, 00:14:10.587 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:10.587 "is_configured": false, 00:14:10.587 "data_offset": 2048, 00:14:10.587 "data_size": 63488 00:14:10.587 }, 00:14:10.587 { 00:14:10.587 "name": null, 00:14:10.587 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:10.587 "is_configured": false, 00:14:10.587 "data_offset": 2048, 00:14:10.587 "data_size": 63488 00:14:10.587 } 00:14:10.587 ] 00:14:10.587 }' 
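For readability, a condensed view of the base-bdev hot-remove/re-add sequence exercised in the surrounding trace. The socket (-s /var/tmp/spdk-raid.sock) and RPC names are exactly those used above and below; only the workspace prefix of rpc.py is shortened here, and the separate capture/jq steps of the trace are shown as a single pipeline:

  # detach a base bdev from the assembled raid1 set; the raid bdev stays "configuring" with one fewer discovered base bdev
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expected: false
  # re-attach it and confirm it is claimed by Existed_Raid again
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expected: true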
00:14:10.587 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.587 13:36:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.154 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:11.154 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.154 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:11.154 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:11.412 [2024-07-15 13:36:58.848726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.412 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.413 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.413 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.413 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.413 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.413 13:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.671 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.671 "name": "Existed_Raid", 00:14:11.671 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:11.671 "strip_size_kb": 0, 00:14:11.671 "state": "configuring", 00:14:11.671 "raid_level": "raid1", 00:14:11.671 "superblock": true, 00:14:11.671 "num_base_bdevs": 3, 00:14:11.671 "num_base_bdevs_discovered": 2, 00:14:11.671 "num_base_bdevs_operational": 3, 00:14:11.671 "base_bdevs_list": [ 00:14:11.671 { 00:14:11.671 "name": "BaseBdev1", 00:14:11.671 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:11.671 "is_configured": true, 00:14:11.671 "data_offset": 2048, 00:14:11.671 "data_size": 63488 00:14:11.671 }, 00:14:11.671 { 00:14:11.671 "name": null, 00:14:11.671 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:11.671 "is_configured": false, 00:14:11.671 "data_offset": 2048, 00:14:11.671 "data_size": 63488 00:14:11.671 }, 00:14:11.671 { 00:14:11.671 "name": "BaseBdev3", 
00:14:11.671 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:11.671 "is_configured": true, 00:14:11.671 "data_offset": 2048, 00:14:11.671 "data_size": 63488 00:14:11.671 } 00:14:11.671 ] 00:14:11.671 }' 00:14:11.671 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.671 13:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.928 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.928 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:12.187 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:12.187 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:12.445 [2024-07-15 13:36:59.867374] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:12.445 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:12.445 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.445 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.445 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:12.445 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.446 13:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.704 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.704 "name": "Existed_Raid", 00:14:12.704 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:12.704 "strip_size_kb": 0, 00:14:12.704 "state": "configuring", 00:14:12.704 "raid_level": "raid1", 00:14:12.704 "superblock": true, 00:14:12.704 "num_base_bdevs": 3, 00:14:12.704 "num_base_bdevs_discovered": 1, 00:14:12.704 "num_base_bdevs_operational": 3, 00:14:12.704 "base_bdevs_list": [ 00:14:12.704 { 00:14:12.704 "name": null, 00:14:12.704 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:12.704 "is_configured": false, 00:14:12.704 "data_offset": 2048, 00:14:12.704 "data_size": 63488 00:14:12.704 }, 00:14:12.704 { 00:14:12.704 "name": null, 00:14:12.704 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:12.704 
"is_configured": false, 00:14:12.704 "data_offset": 2048, 00:14:12.704 "data_size": 63488 00:14:12.704 }, 00:14:12.704 { 00:14:12.704 "name": "BaseBdev3", 00:14:12.704 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:12.704 "is_configured": true, 00:14:12.704 "data_offset": 2048, 00:14:12.704 "data_size": 63488 00:14:12.704 } 00:14:12.704 ] 00:14:12.704 }' 00:14:12.704 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.704 13:37:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.270 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.270 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:13.270 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:13.270 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:13.528 [2024-07-15 13:37:00.916864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.528 13:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.528 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.528 "name": "Existed_Raid", 00:14:13.528 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:13.528 "strip_size_kb": 0, 00:14:13.528 "state": "configuring", 00:14:13.528 "raid_level": "raid1", 00:14:13.528 "superblock": true, 00:14:13.528 "num_base_bdevs": 3, 00:14:13.528 "num_base_bdevs_discovered": 2, 00:14:13.528 "num_base_bdevs_operational": 3, 00:14:13.528 "base_bdevs_list": [ 00:14:13.528 { 00:14:13.528 "name": null, 00:14:13.528 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:13.528 "is_configured": false, 
00:14:13.528 "data_offset": 2048, 00:14:13.528 "data_size": 63488 00:14:13.528 }, 00:14:13.528 { 00:14:13.528 "name": "BaseBdev2", 00:14:13.528 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:13.528 "is_configured": true, 00:14:13.528 "data_offset": 2048, 00:14:13.528 "data_size": 63488 00:14:13.528 }, 00:14:13.528 { 00:14:13.528 "name": "BaseBdev3", 00:14:13.528 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:13.528 "is_configured": true, 00:14:13.528 "data_offset": 2048, 00:14:13.528 "data_size": 63488 00:14:13.528 } 00:14:13.528 ] 00:14:13.528 }' 00:14:13.528 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.528 13:37:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:14.095 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.095 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:14.352 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:14.352 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.352 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:14.610 13:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ff5055bb-8270-4ee3-bb7c-313cfa8df760 00:14:14.610 [2024-07-15 13:37:02.144137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:14.610 [2024-07-15 13:37:02.144283] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x236e110 00:14:14.610 [2024-07-15 13:37:02.144293] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:14.610 [2024-07-15 13:37:02.144429] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21c9f20 00:14:14.610 [2024-07-15 13:37:02.144519] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x236e110 00:14:14.610 [2024-07-15 13:37:02.144526] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x236e110 00:14:14.610 [2024-07-15 13:37:02.144593] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:14.610 NewBaseBdev 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:14.610 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:14.868 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:14.868 [ 00:14:14.868 { 00:14:14.868 "name": "NewBaseBdev", 00:14:14.868 "aliases": [ 00:14:14.868 "ff5055bb-8270-4ee3-bb7c-313cfa8df760" 00:14:14.868 ], 00:14:14.868 "product_name": "Malloc disk", 00:14:14.868 "block_size": 512, 00:14:14.868 "num_blocks": 65536, 00:14:14.868 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:14.868 "assigned_rate_limits": { 00:14:14.868 "rw_ios_per_sec": 0, 00:14:14.868 "rw_mbytes_per_sec": 0, 00:14:14.868 "r_mbytes_per_sec": 0, 00:14:14.868 "w_mbytes_per_sec": 0 00:14:14.868 }, 00:14:14.868 "claimed": true, 00:14:14.868 "claim_type": "exclusive_write", 00:14:14.868 "zoned": false, 00:14:14.868 "supported_io_types": { 00:14:14.868 "read": true, 00:14:14.868 "write": true, 00:14:14.868 "unmap": true, 00:14:14.868 "flush": true, 00:14:14.868 "reset": true, 00:14:14.868 "nvme_admin": false, 00:14:14.868 "nvme_io": false, 00:14:14.868 "nvme_io_md": false, 00:14:14.868 "write_zeroes": true, 00:14:14.868 "zcopy": true, 00:14:14.868 "get_zone_info": false, 00:14:14.868 "zone_management": false, 00:14:14.868 "zone_append": false, 00:14:14.868 "compare": false, 00:14:14.868 "compare_and_write": false, 00:14:14.869 "abort": true, 00:14:14.869 "seek_hole": false, 00:14:14.869 "seek_data": false, 00:14:14.869 "copy": true, 00:14:14.869 "nvme_iov_md": false 00:14:14.869 }, 00:14:14.869 "memory_domains": [ 00:14:14.869 { 00:14:14.869 "dma_device_id": "system", 00:14:14.869 "dma_device_type": 1 00:14:14.869 }, 00:14:14.869 { 00:14:14.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.869 "dma_device_type": 2 00:14:14.869 } 00:14:14.869 ], 00:14:14.869 "driver_specific": {} 00:14:14.869 } 00:14:14.869 ] 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.127 "name": "Existed_Raid", 00:14:15.127 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:15.127 "strip_size_kb": 0, 00:14:15.127 "state": "online", 00:14:15.127 "raid_level": "raid1", 00:14:15.127 "superblock": true, 00:14:15.127 "num_base_bdevs": 3, 00:14:15.127 "num_base_bdevs_discovered": 3, 00:14:15.127 "num_base_bdevs_operational": 3, 00:14:15.127 "base_bdevs_list": [ 00:14:15.127 { 00:14:15.127 "name": "NewBaseBdev", 00:14:15.127 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:15.127 "is_configured": true, 00:14:15.127 "data_offset": 2048, 00:14:15.127 "data_size": 63488 00:14:15.127 }, 00:14:15.127 { 00:14:15.127 "name": "BaseBdev2", 00:14:15.127 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:15.127 "is_configured": true, 00:14:15.127 "data_offset": 2048, 00:14:15.127 "data_size": 63488 00:14:15.127 }, 00:14:15.127 { 00:14:15.127 "name": "BaseBdev3", 00:14:15.127 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:15.127 "is_configured": true, 00:14:15.127 "data_offset": 2048, 00:14:15.127 "data_size": 63488 00:14:15.127 } 00:14:15.127 ] 00:14:15.127 }' 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.127 13:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:15.706 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:15.706 [2024-07-15 13:37:03.315360] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:15.965 "name": "Existed_Raid", 00:14:15.965 "aliases": [ 00:14:15.965 "9b681326-d327-4b77-a816-fa48890e114f" 00:14:15.965 ], 00:14:15.965 "product_name": "Raid Volume", 00:14:15.965 "block_size": 512, 00:14:15.965 "num_blocks": 63488, 00:14:15.965 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:15.965 "assigned_rate_limits": { 00:14:15.965 "rw_ios_per_sec": 0, 00:14:15.965 "rw_mbytes_per_sec": 0, 00:14:15.965 "r_mbytes_per_sec": 0, 00:14:15.965 "w_mbytes_per_sec": 0 00:14:15.965 }, 00:14:15.965 "claimed": false, 00:14:15.965 "zoned": false, 00:14:15.965 "supported_io_types": { 00:14:15.965 "read": true, 00:14:15.965 "write": true, 00:14:15.965 "unmap": false, 00:14:15.965 "flush": false, 00:14:15.965 "reset": true, 00:14:15.965 "nvme_admin": false, 00:14:15.965 "nvme_io": false, 00:14:15.965 "nvme_io_md": 
false, 00:14:15.965 "write_zeroes": true, 00:14:15.965 "zcopy": false, 00:14:15.965 "get_zone_info": false, 00:14:15.965 "zone_management": false, 00:14:15.965 "zone_append": false, 00:14:15.965 "compare": false, 00:14:15.965 "compare_and_write": false, 00:14:15.965 "abort": false, 00:14:15.965 "seek_hole": false, 00:14:15.965 "seek_data": false, 00:14:15.965 "copy": false, 00:14:15.965 "nvme_iov_md": false 00:14:15.965 }, 00:14:15.965 "memory_domains": [ 00:14:15.965 { 00:14:15.965 "dma_device_id": "system", 00:14:15.965 "dma_device_type": 1 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.965 "dma_device_type": 2 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "dma_device_id": "system", 00:14:15.965 "dma_device_type": 1 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.965 "dma_device_type": 2 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "dma_device_id": "system", 00:14:15.965 "dma_device_type": 1 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.965 "dma_device_type": 2 00:14:15.965 } 00:14:15.965 ], 00:14:15.965 "driver_specific": { 00:14:15.965 "raid": { 00:14:15.965 "uuid": "9b681326-d327-4b77-a816-fa48890e114f", 00:14:15.965 "strip_size_kb": 0, 00:14:15.965 "state": "online", 00:14:15.965 "raid_level": "raid1", 00:14:15.965 "superblock": true, 00:14:15.965 "num_base_bdevs": 3, 00:14:15.965 "num_base_bdevs_discovered": 3, 00:14:15.965 "num_base_bdevs_operational": 3, 00:14:15.965 "base_bdevs_list": [ 00:14:15.965 { 00:14:15.965 "name": "NewBaseBdev", 00:14:15.965 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:15.965 "is_configured": true, 00:14:15.965 "data_offset": 2048, 00:14:15.965 "data_size": 63488 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "name": "BaseBdev2", 00:14:15.965 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:15.965 "is_configured": true, 00:14:15.965 "data_offset": 2048, 00:14:15.965 "data_size": 63488 00:14:15.965 }, 00:14:15.965 { 00:14:15.965 "name": "BaseBdev3", 00:14:15.965 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:15.965 "is_configured": true, 00:14:15.965 "data_offset": 2048, 00:14:15.965 "data_size": 63488 00:14:15.965 } 00:14:15.965 ] 00:14:15.965 } 00:14:15.965 } 00:14:15.965 }' 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:15.965 BaseBdev2 00:14:15.965 BaseBdev3' 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.965 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.965 "name": "NewBaseBdev", 00:14:15.965 "aliases": [ 00:14:15.965 "ff5055bb-8270-4ee3-bb7c-313cfa8df760" 00:14:15.965 ], 00:14:15.965 "product_name": "Malloc disk", 00:14:15.965 "block_size": 512, 00:14:15.965 "num_blocks": 65536, 00:14:15.965 "uuid": "ff5055bb-8270-4ee3-bb7c-313cfa8df760", 00:14:15.965 "assigned_rate_limits": { 00:14:15.965 
"rw_ios_per_sec": 0, 00:14:15.965 "rw_mbytes_per_sec": 0, 00:14:15.965 "r_mbytes_per_sec": 0, 00:14:15.965 "w_mbytes_per_sec": 0 00:14:15.965 }, 00:14:15.965 "claimed": true, 00:14:15.965 "claim_type": "exclusive_write", 00:14:15.965 "zoned": false, 00:14:15.965 "supported_io_types": { 00:14:15.965 "read": true, 00:14:15.965 "write": true, 00:14:15.965 "unmap": true, 00:14:15.965 "flush": true, 00:14:15.965 "reset": true, 00:14:15.965 "nvme_admin": false, 00:14:15.965 "nvme_io": false, 00:14:15.965 "nvme_io_md": false, 00:14:15.965 "write_zeroes": true, 00:14:15.965 "zcopy": true, 00:14:15.965 "get_zone_info": false, 00:14:15.965 "zone_management": false, 00:14:15.965 "zone_append": false, 00:14:15.965 "compare": false, 00:14:15.965 "compare_and_write": false, 00:14:15.965 "abort": true, 00:14:15.965 "seek_hole": false, 00:14:15.965 "seek_data": false, 00:14:15.965 "copy": true, 00:14:15.965 "nvme_iov_md": false 00:14:15.965 }, 00:14:15.965 "memory_domains": [ 00:14:15.965 { 00:14:15.965 "dma_device_id": "system", 00:14:15.966 "dma_device_type": 1 00:14:15.966 }, 00:14:15.966 { 00:14:15.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.966 "dma_device_type": 2 00:14:15.966 } 00:14:15.966 ], 00:14:15.966 "driver_specific": {} 00:14:15.966 }' 00:14:15.966 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.224 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.483 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.483 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.483 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:16.483 13:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:16.483 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:16.483 "name": "BaseBdev2", 00:14:16.483 "aliases": [ 00:14:16.483 "f1418626-c9b7-4cc8-b169-0118d731a274" 00:14:16.483 ], 00:14:16.483 "product_name": "Malloc disk", 00:14:16.483 "block_size": 512, 00:14:16.483 "num_blocks": 65536, 00:14:16.483 "uuid": "f1418626-c9b7-4cc8-b169-0118d731a274", 00:14:16.483 "assigned_rate_limits": { 00:14:16.483 "rw_ios_per_sec": 0, 00:14:16.483 "rw_mbytes_per_sec": 0, 00:14:16.483 "r_mbytes_per_sec": 0, 00:14:16.483 "w_mbytes_per_sec": 0 
00:14:16.483 }, 00:14:16.483 "claimed": true, 00:14:16.483 "claim_type": "exclusive_write", 00:14:16.483 "zoned": false, 00:14:16.483 "supported_io_types": { 00:14:16.483 "read": true, 00:14:16.483 "write": true, 00:14:16.483 "unmap": true, 00:14:16.483 "flush": true, 00:14:16.483 "reset": true, 00:14:16.483 "nvme_admin": false, 00:14:16.483 "nvme_io": false, 00:14:16.483 "nvme_io_md": false, 00:14:16.483 "write_zeroes": true, 00:14:16.483 "zcopy": true, 00:14:16.483 "get_zone_info": false, 00:14:16.483 "zone_management": false, 00:14:16.483 "zone_append": false, 00:14:16.483 "compare": false, 00:14:16.483 "compare_and_write": false, 00:14:16.483 "abort": true, 00:14:16.483 "seek_hole": false, 00:14:16.483 "seek_data": false, 00:14:16.483 "copy": true, 00:14:16.483 "nvme_iov_md": false 00:14:16.483 }, 00:14:16.483 "memory_domains": [ 00:14:16.483 { 00:14:16.483 "dma_device_id": "system", 00:14:16.483 "dma_device_type": 1 00:14:16.483 }, 00:14:16.483 { 00:14:16.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.483 "dma_device_type": 2 00:14:16.483 } 00:14:16.483 ], 00:14:16.483 "driver_specific": {} 00:14:16.483 }' 00:14:16.483 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.483 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:16.742 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.000 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.000 "name": "BaseBdev3", 00:14:17.000 "aliases": [ 00:14:17.000 "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f" 00:14:17.000 ], 00:14:17.000 "product_name": "Malloc disk", 00:14:17.000 "block_size": 512, 00:14:17.000 "num_blocks": 65536, 00:14:17.000 "uuid": "8da1e087-18ff-4e5a-aefa-e09a9a55ec7f", 00:14:17.000 "assigned_rate_limits": { 00:14:17.000 "rw_ios_per_sec": 0, 00:14:17.000 "rw_mbytes_per_sec": 0, 00:14:17.000 "r_mbytes_per_sec": 0, 00:14:17.000 "w_mbytes_per_sec": 0 00:14:17.000 }, 00:14:17.000 "claimed": true, 00:14:17.000 "claim_type": "exclusive_write", 00:14:17.000 "zoned": false, 00:14:17.000 
"supported_io_types": { 00:14:17.000 "read": true, 00:14:17.000 "write": true, 00:14:17.000 "unmap": true, 00:14:17.000 "flush": true, 00:14:17.000 "reset": true, 00:14:17.000 "nvme_admin": false, 00:14:17.000 "nvme_io": false, 00:14:17.000 "nvme_io_md": false, 00:14:17.000 "write_zeroes": true, 00:14:17.000 "zcopy": true, 00:14:17.000 "get_zone_info": false, 00:14:17.000 "zone_management": false, 00:14:17.000 "zone_append": false, 00:14:17.000 "compare": false, 00:14:17.000 "compare_and_write": false, 00:14:17.000 "abort": true, 00:14:17.000 "seek_hole": false, 00:14:17.000 "seek_data": false, 00:14:17.000 "copy": true, 00:14:17.000 "nvme_iov_md": false 00:14:17.000 }, 00:14:17.000 "memory_domains": [ 00:14:17.000 { 00:14:17.000 "dma_device_id": "system", 00:14:17.000 "dma_device_type": 1 00:14:17.000 }, 00:14:17.000 { 00:14:17.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.000 "dma_device_type": 2 00:14:17.000 } 00:14:17.000 ], 00:14:17.000 "driver_specific": {} 00:14:17.000 }' 00:14:17.000 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.000 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.000 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.000 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.259 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:17.518 [2024-07-15 13:37:04.943388] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:17.518 [2024-07-15 13:37:04.943412] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:17.518 [2024-07-15 13:37:04.943461] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:17.518 [2024-07-15 13:37:04.943657] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:17.518 [2024-07-15 13:37:04.943667] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x236e110 name Existed_Raid, state offline 00:14:17.518 13:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 13491 00:14:17.518 13:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 13491 ']' 00:14:17.518 13:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 13491 00:14:17.518 13:37:04 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@953 -- # uname 00:14:17.518 13:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:17.518 13:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 13491 00:14:17.518 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:17.518 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:17.518 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 13491' 00:14:17.518 killing process with pid 13491 00:14:17.518 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 13491 00:14:17.518 [2024-07-15 13:37:05.002117] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:17.518 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 13491 00:14:17.518 [2024-07-15 13:37:05.031123] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:17.777 13:37:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:17.777 00:14:17.777 real 0m21.850s 00:14:17.777 user 0m39.774s 00:14:17.777 sys 0m4.297s 00:14:17.777 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:17.777 13:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:17.777 ************************************ 00:14:17.777 END TEST raid_state_function_test_sb 00:14:17.777 ************************************ 00:14:17.777 13:37:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:17.777 13:37:05 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:14:17.777 13:37:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:17.777 13:37:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:17.777 13:37:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:17.777 ************************************ 00:14:17.777 START TEST raid_superblock_test 00:14:17.777 ************************************ 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- 
# local strip_size_create_arg 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=16931 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 16931 /var/tmp/spdk-raid.sock 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 16931 ']' 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:17.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:17.777 13:37:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.777 [2024-07-15 13:37:05.345453] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
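For orientation, the setup the following trace walks through, in condensed form: each base device is a malloc bdev (created with "32 512", i.e. 512-byte blocks) wrapped in a passthru bdev carrying a fixed UUID, and the raid1 volume is then assembled from the three passthru bdevs with the -s flag (the resulting dump further below reports "superblock": true). Only the rpc.py path is shortened; the commands themselves appear verbatim in the trace that follows:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  # ... repeated for malloc2/pt2 (...-0000000002) and malloc3/pt3 (...-0000000003) ...
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s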
00:14:17.777 [2024-07-15 13:37:05.345497] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid16931 ] 00:14:18.035 [2024-07-15 13:37:05.432911] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:18.035 [2024-07-15 13:37:05.520725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:18.035 [2024-07-15 13:37:05.571084] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:18.035 [2024-07-15 13:37:05.571114] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:18.603 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:18.865 malloc1 00:14:18.865 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:19.203 [2024-07-15 13:37:06.494364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:19.203 [2024-07-15 13:37:06.494403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.203 [2024-07-15 13:37:06.494419] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x162e260 00:14:19.203 [2024-07-15 13:37:06.494427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.203 [2024-07-15 13:37:06.495878] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.203 [2024-07-15 13:37:06.495901] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:19.203 pt1 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:19.203 malloc2 00:14:19.203 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:19.486 [2024-07-15 13:37:06.848350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:19.486 [2024-07-15 13:37:06.848387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.486 [2024-07-15 13:37:06.848415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d8310 00:14:19.486 [2024-07-15 13:37:06.848424] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.486 [2024-07-15 13:37:06.849584] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.486 [2024-07-15 13:37:06.849606] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:19.486 pt2 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:19.486 13:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:19.486 malloc3 00:14:19.486 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:19.744 [2024-07-15 13:37:07.192938] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:19.744 [2024-07-15 13:37:07.192975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.744 [2024-07-15 13:37:07.192988] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17dbe70 00:14:19.744 [2024-07-15 13:37:07.193003] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.744 [2024-07-15 13:37:07.194184] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.744 [2024-07-15 13:37:07.194207] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:19.744 pt3 00:14:19.744 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:19.744 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:19.744 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:19.744 [2024-07-15 13:37:07.357382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:19.744 [2024-07-15 13:37:07.358389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:19.744 [2024-07-15 13:37:07.358441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:19.744 [2024-07-15 13:37:07.358555] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17dce80 00:14:19.744 [2024-07-15 13:37:07.358563] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:19.744 [2024-07-15 13:37:07.358708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d7490 00:14:19.744 [2024-07-15 13:37:07.358816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17dce80 00:14:19.744 [2024-07-15 13:37:07.358823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17dce80 00:14:19.744 [2024-07-15 13:37:07.358894] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.002 "name": "raid_bdev1", 00:14:20.002 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:20.002 "strip_size_kb": 0, 00:14:20.002 "state": "online", 00:14:20.002 "raid_level": "raid1", 00:14:20.002 "superblock": true, 00:14:20.002 "num_base_bdevs": 3, 00:14:20.002 
"num_base_bdevs_discovered": 3, 00:14:20.002 "num_base_bdevs_operational": 3, 00:14:20.002 "base_bdevs_list": [ 00:14:20.002 { 00:14:20.002 "name": "pt1", 00:14:20.002 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.002 "is_configured": true, 00:14:20.002 "data_offset": 2048, 00:14:20.002 "data_size": 63488 00:14:20.002 }, 00:14:20.002 { 00:14:20.002 "name": "pt2", 00:14:20.002 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.002 "is_configured": true, 00:14:20.002 "data_offset": 2048, 00:14:20.002 "data_size": 63488 00:14:20.002 }, 00:14:20.002 { 00:14:20.002 "name": "pt3", 00:14:20.002 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:20.002 "is_configured": true, 00:14:20.002 "data_offset": 2048, 00:14:20.002 "data_size": 63488 00:14:20.002 } 00:14:20.002 ] 00:14:20.002 }' 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.002 13:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:20.568 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:20.826 [2024-07-15 13:37:08.207728] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.826 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:20.826 "name": "raid_bdev1", 00:14:20.826 "aliases": [ 00:14:20.826 "e3ac7e95-03fd-4953-99bb-e0adaca8669c" 00:14:20.826 ], 00:14:20.826 "product_name": "Raid Volume", 00:14:20.826 "block_size": 512, 00:14:20.826 "num_blocks": 63488, 00:14:20.826 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:20.826 "assigned_rate_limits": { 00:14:20.826 "rw_ios_per_sec": 0, 00:14:20.826 "rw_mbytes_per_sec": 0, 00:14:20.826 "r_mbytes_per_sec": 0, 00:14:20.826 "w_mbytes_per_sec": 0 00:14:20.826 }, 00:14:20.826 "claimed": false, 00:14:20.826 "zoned": false, 00:14:20.826 "supported_io_types": { 00:14:20.826 "read": true, 00:14:20.826 "write": true, 00:14:20.826 "unmap": false, 00:14:20.826 "flush": false, 00:14:20.826 "reset": true, 00:14:20.826 "nvme_admin": false, 00:14:20.826 "nvme_io": false, 00:14:20.826 "nvme_io_md": false, 00:14:20.826 "write_zeroes": true, 00:14:20.826 "zcopy": false, 00:14:20.826 "get_zone_info": false, 00:14:20.826 "zone_management": false, 00:14:20.826 "zone_append": false, 00:14:20.826 "compare": false, 00:14:20.826 "compare_and_write": false, 00:14:20.826 "abort": false, 00:14:20.826 "seek_hole": false, 00:14:20.826 "seek_data": false, 00:14:20.826 "copy": false, 00:14:20.826 "nvme_iov_md": false 00:14:20.826 }, 00:14:20.826 "memory_domains": [ 00:14:20.826 { 00:14:20.826 "dma_device_id": "system", 00:14:20.826 "dma_device_type": 1 00:14:20.826 }, 
00:14:20.826 { 00:14:20.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.826 "dma_device_type": 2 00:14:20.826 }, 00:14:20.826 { 00:14:20.826 "dma_device_id": "system", 00:14:20.826 "dma_device_type": 1 00:14:20.826 }, 00:14:20.826 { 00:14:20.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.826 "dma_device_type": 2 00:14:20.826 }, 00:14:20.826 { 00:14:20.826 "dma_device_id": "system", 00:14:20.826 "dma_device_type": 1 00:14:20.826 }, 00:14:20.826 { 00:14:20.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.826 "dma_device_type": 2 00:14:20.826 } 00:14:20.826 ], 00:14:20.826 "driver_specific": { 00:14:20.826 "raid": { 00:14:20.826 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:20.826 "strip_size_kb": 0, 00:14:20.826 "state": "online", 00:14:20.826 "raid_level": "raid1", 00:14:20.826 "superblock": true, 00:14:20.826 "num_base_bdevs": 3, 00:14:20.826 "num_base_bdevs_discovered": 3, 00:14:20.826 "num_base_bdevs_operational": 3, 00:14:20.826 "base_bdevs_list": [ 00:14:20.826 { 00:14:20.826 "name": "pt1", 00:14:20.826 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.826 "is_configured": true, 00:14:20.826 "data_offset": 2048, 00:14:20.826 "data_size": 63488 00:14:20.826 }, 00:14:20.826 { 00:14:20.826 "name": "pt2", 00:14:20.826 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.826 "is_configured": true, 00:14:20.826 "data_offset": 2048, 00:14:20.826 "data_size": 63488 00:14:20.826 }, 00:14:20.826 { 00:14:20.826 "name": "pt3", 00:14:20.826 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:20.826 "is_configured": true, 00:14:20.827 "data_offset": 2048, 00:14:20.827 "data_size": 63488 00:14:20.827 } 00:14:20.827 ] 00:14:20.827 } 00:14:20.827 } 00:14:20.827 }' 00:14:20.827 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:20.827 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:20.827 pt2 00:14:20.827 pt3' 00:14:20.827 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.827 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:20.827 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:20.827 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:20.827 "name": "pt1", 00:14:20.827 "aliases": [ 00:14:20.827 "00000000-0000-0000-0000-000000000001" 00:14:20.827 ], 00:14:20.827 "product_name": "passthru", 00:14:20.827 "block_size": 512, 00:14:20.827 "num_blocks": 65536, 00:14:20.827 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.827 "assigned_rate_limits": { 00:14:20.827 "rw_ios_per_sec": 0, 00:14:20.827 "rw_mbytes_per_sec": 0, 00:14:20.827 "r_mbytes_per_sec": 0, 00:14:20.827 "w_mbytes_per_sec": 0 00:14:20.827 }, 00:14:20.827 "claimed": true, 00:14:20.827 "claim_type": "exclusive_write", 00:14:20.827 "zoned": false, 00:14:20.827 "supported_io_types": { 00:14:20.827 "read": true, 00:14:20.827 "write": true, 00:14:20.827 "unmap": true, 00:14:20.827 "flush": true, 00:14:20.827 "reset": true, 00:14:20.827 "nvme_admin": false, 00:14:20.827 "nvme_io": false, 00:14:20.827 "nvme_io_md": false, 00:14:20.827 "write_zeroes": true, 00:14:20.827 "zcopy": true, 00:14:20.827 "get_zone_info": false, 00:14:20.827 "zone_management": false, 00:14:20.827 
"zone_append": false, 00:14:20.827 "compare": false, 00:14:20.827 "compare_and_write": false, 00:14:20.827 "abort": true, 00:14:20.827 "seek_hole": false, 00:14:20.827 "seek_data": false, 00:14:20.827 "copy": true, 00:14:20.827 "nvme_iov_md": false 00:14:20.827 }, 00:14:20.827 "memory_domains": [ 00:14:20.827 { 00:14:20.827 "dma_device_id": "system", 00:14:20.827 "dma_device_type": 1 00:14:20.827 }, 00:14:20.827 { 00:14:20.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.827 "dma_device_type": 2 00:14:20.827 } 00:14:20.827 ], 00:14:20.827 "driver_specific": { 00:14:20.827 "passthru": { 00:14:20.827 "name": "pt1", 00:14:20.827 "base_bdev_name": "malloc1" 00:14:20.827 } 00:14:20.827 } 00:14:20.827 }' 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.083 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.340 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.340 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.340 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:21.340 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:21.340 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.340 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.340 "name": "pt2", 00:14:21.340 "aliases": [ 00:14:21.340 "00000000-0000-0000-0000-000000000002" 00:14:21.340 ], 00:14:21.340 "product_name": "passthru", 00:14:21.340 "block_size": 512, 00:14:21.340 "num_blocks": 65536, 00:14:21.340 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:21.340 "assigned_rate_limits": { 00:14:21.340 "rw_ios_per_sec": 0, 00:14:21.340 "rw_mbytes_per_sec": 0, 00:14:21.340 "r_mbytes_per_sec": 0, 00:14:21.340 "w_mbytes_per_sec": 0 00:14:21.340 }, 00:14:21.340 "claimed": true, 00:14:21.340 "claim_type": "exclusive_write", 00:14:21.340 "zoned": false, 00:14:21.340 "supported_io_types": { 00:14:21.340 "read": true, 00:14:21.340 "write": true, 00:14:21.340 "unmap": true, 00:14:21.340 "flush": true, 00:14:21.340 "reset": true, 00:14:21.340 "nvme_admin": false, 00:14:21.340 "nvme_io": false, 00:14:21.340 "nvme_io_md": false, 00:14:21.340 "write_zeroes": true, 00:14:21.340 "zcopy": true, 00:14:21.340 "get_zone_info": false, 00:14:21.340 "zone_management": false, 00:14:21.340 "zone_append": false, 00:14:21.340 "compare": false, 00:14:21.340 "compare_and_write": false, 00:14:21.340 "abort": true, 00:14:21.340 
"seek_hole": false, 00:14:21.340 "seek_data": false, 00:14:21.340 "copy": true, 00:14:21.340 "nvme_iov_md": false 00:14:21.340 }, 00:14:21.340 "memory_domains": [ 00:14:21.340 { 00:14:21.340 "dma_device_id": "system", 00:14:21.340 "dma_device_type": 1 00:14:21.340 }, 00:14:21.340 { 00:14:21.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.340 "dma_device_type": 2 00:14:21.340 } 00:14:21.340 ], 00:14:21.340 "driver_specific": { 00:14:21.340 "passthru": { 00:14:21.340 "name": "pt2", 00:14:21.340 "base_bdev_name": "malloc2" 00:14:21.340 } 00:14:21.340 } 00:14:21.340 }' 00:14:21.341 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.598 13:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.598 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.856 "name": "pt3", 00:14:21.856 "aliases": [ 00:14:21.856 "00000000-0000-0000-0000-000000000003" 00:14:21.856 ], 00:14:21.856 "product_name": "passthru", 00:14:21.856 "block_size": 512, 00:14:21.856 "num_blocks": 65536, 00:14:21.856 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:21.856 "assigned_rate_limits": { 00:14:21.856 "rw_ios_per_sec": 0, 00:14:21.856 "rw_mbytes_per_sec": 0, 00:14:21.856 "r_mbytes_per_sec": 0, 00:14:21.856 "w_mbytes_per_sec": 0 00:14:21.856 }, 00:14:21.856 "claimed": true, 00:14:21.856 "claim_type": "exclusive_write", 00:14:21.856 "zoned": false, 00:14:21.856 "supported_io_types": { 00:14:21.856 "read": true, 00:14:21.856 "write": true, 00:14:21.856 "unmap": true, 00:14:21.856 "flush": true, 00:14:21.856 "reset": true, 00:14:21.856 "nvme_admin": false, 00:14:21.856 "nvme_io": false, 00:14:21.856 "nvme_io_md": false, 00:14:21.856 "write_zeroes": true, 00:14:21.856 "zcopy": true, 00:14:21.856 "get_zone_info": false, 00:14:21.856 "zone_management": false, 00:14:21.856 "zone_append": false, 00:14:21.856 "compare": false, 00:14:21.856 "compare_and_write": false, 00:14:21.856 "abort": true, 00:14:21.856 "seek_hole": false, 00:14:21.856 "seek_data": false, 00:14:21.856 "copy": true, 00:14:21.856 "nvme_iov_md": false 00:14:21.856 }, 
00:14:21.856 "memory_domains": [ 00:14:21.856 { 00:14:21.856 "dma_device_id": "system", 00:14:21.856 "dma_device_type": 1 00:14:21.856 }, 00:14:21.856 { 00:14:21.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.856 "dma_device_type": 2 00:14:21.856 } 00:14:21.856 ], 00:14:21.856 "driver_specific": { 00:14:21.856 "passthru": { 00:14:21.856 "name": "pt3", 00:14:21.856 "base_bdev_name": "malloc3" 00:14:21.856 } 00:14:21.856 } 00:14:21.856 }' 00:14:21.856 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.114 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:22.373 [2024-07-15 13:37:09.940199] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e3ac7e95-03fd-4953-99bb-e0adaca8669c 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z e3ac7e95-03fd-4953-99bb-e0adaca8669c ']' 00:14:22.373 13:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:22.631 [2024-07-15 13:37:10.128494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:22.631 [2024-07-15 13:37:10.128513] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:22.631 [2024-07-15 13:37:10.128550] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:22.631 [2024-07-15 13:37:10.128597] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:22.631 [2024-07-15 13:37:10.128605] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dce80 name raid_bdev1, state offline 00:14:22.631 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.631 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:22.889 13:37:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:22.889 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:22.889 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:22.889 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:22.889 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:22.889 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:23.147 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:23.147 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:23.405 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:23.405 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.405 13:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:23.405 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:23.405 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:23.405 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:23.662 
[2024-07-15 13:37:11.163146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:23.662 [2024-07-15 13:37:11.164169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:23.662 [2024-07-15 13:37:11.164199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:23.662 [2024-07-15 13:37:11.164233] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:23.662 [2024-07-15 13:37:11.164263] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:23.662 [2024-07-15 13:37:11.164293] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:23.662 [2024-07-15 13:37:11.164306] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:23.662 [2024-07-15 13:37:11.164313] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d8540 name raid_bdev1, state configuring 00:14:23.662 request: 00:14:23.662 { 00:14:23.662 "name": "raid_bdev1", 00:14:23.662 "raid_level": "raid1", 00:14:23.662 "base_bdevs": [ 00:14:23.662 "malloc1", 00:14:23.662 "malloc2", 00:14:23.662 "malloc3" 00:14:23.662 ], 00:14:23.662 "superblock": false, 00:14:23.662 "method": "bdev_raid_create", 00:14:23.662 "req_id": 1 00:14:23.662 } 00:14:23.662 Got JSON-RPC error response 00:14:23.662 response: 00:14:23.662 { 00:14:23.662 "code": -17, 00:14:23.662 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:23.662 } 00:14:23.662 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:23.662 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:23.662 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:23.662 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:23.662 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.662 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:23.920 [2024-07-15 13:37:11.512039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:23.920 [2024-07-15 13:37:11.512089] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:23.920 [2024-07-15 13:37:11.512120] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d89b0 00:14:23.920 [2024-07-15 13:37:11.512128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:23.920 [2024-07-15 13:37:11.513343] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:23.920 [2024-07-15 13:37:11.513368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:23.920 
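The failure and recovery traced above can be replayed by hand against the same SPDK target; the sketch below is not part of bdev_raid.sh, it simply re-issues the RPC calls visible in the log (socket path, bdev names and UUID copied verbatim from the trace):

  # Sketch only; assumes the spdk_tgt used by the test is still running.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Creating raid_bdev1 directly on the malloc bdevs is expected to fail with
  # -17 (File exists): each malloc bdev still carries the superblock of the
  # raid bdev deleted a few lines earlier in the trace.
  "$rpc" -s "$sock" bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 \
      || echo 'expected: Failed to create RAID bdev raid_bdev1: File exists'

  # Re-wrapping a base bdev in a passthru bdev with its original UUID lets the
  # examine path find that superblock and re-claim the bdev for raid_bdev1,
  # which is why raid_bdev1 reappears below in the "configuring" state.
  "$rpc" -s "$sock" bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001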
[2024-07-15 13:37:11.513423] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:23.920 [2024-07-15 13:37:11.513444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:23.920 pt1 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.920 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.178 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.178 "name": "raid_bdev1", 00:14:24.178 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:24.178 "strip_size_kb": 0, 00:14:24.178 "state": "configuring", 00:14:24.178 "raid_level": "raid1", 00:14:24.178 "superblock": true, 00:14:24.178 "num_base_bdevs": 3, 00:14:24.178 "num_base_bdevs_discovered": 1, 00:14:24.178 "num_base_bdevs_operational": 3, 00:14:24.178 "base_bdevs_list": [ 00:14:24.178 { 00:14:24.178 "name": "pt1", 00:14:24.178 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:24.178 "is_configured": true, 00:14:24.178 "data_offset": 2048, 00:14:24.178 "data_size": 63488 00:14:24.178 }, 00:14:24.178 { 00:14:24.178 "name": null, 00:14:24.178 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:24.178 "is_configured": false, 00:14:24.178 "data_offset": 2048, 00:14:24.178 "data_size": 63488 00:14:24.178 }, 00:14:24.178 { 00:14:24.178 "name": null, 00:14:24.178 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:24.178 "is_configured": false, 00:14:24.178 "data_offset": 2048, 00:14:24.178 "data_size": 63488 00:14:24.178 } 00:14:24.178 ] 00:14:24.178 }' 00:14:24.178 13:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.178 13:37:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.745 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:24.745 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:24.745 [2024-07-15 13:37:12.310089] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:24.745 
[2024-07-15 13:37:12.310128] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.745 [2024-07-15 13:37:12.310157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d9720 00:14:24.745 [2024-07-15 13:37:12.310166] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.745 [2024-07-15 13:37:12.310416] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.745 [2024-07-15 13:37:12.310428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:24.745 [2024-07-15 13:37:12.310474] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:24.745 [2024-07-15 13:37:12.310487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:24.745 pt2 00:14:24.745 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:25.003 [2024-07-15 13:37:12.482549] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:25.003 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.262 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.262 "name": "raid_bdev1", 00:14:25.262 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:25.262 "strip_size_kb": 0, 00:14:25.262 "state": "configuring", 00:14:25.262 "raid_level": "raid1", 00:14:25.262 "superblock": true, 00:14:25.262 "num_base_bdevs": 3, 00:14:25.262 "num_base_bdevs_discovered": 1, 00:14:25.262 "num_base_bdevs_operational": 3, 00:14:25.262 "base_bdevs_list": [ 00:14:25.262 { 00:14:25.262 "name": "pt1", 00:14:25.262 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:25.262 "is_configured": true, 00:14:25.262 "data_offset": 2048, 00:14:25.262 "data_size": 63488 00:14:25.262 }, 00:14:25.262 { 00:14:25.262 "name": null, 00:14:25.262 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:25.262 "is_configured": false, 00:14:25.262 "data_offset": 2048, 00:14:25.262 "data_size": 63488 00:14:25.262 }, 00:14:25.262 { 00:14:25.262 
"name": null, 00:14:25.262 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:25.262 "is_configured": false, 00:14:25.262 "data_offset": 2048, 00:14:25.262 "data_size": 63488 00:14:25.262 } 00:14:25.262 ] 00:14:25.262 }' 00:14:25.262 13:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.262 13:37:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.828 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:25.828 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:25.828 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:25.828 [2024-07-15 13:37:13.336740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:25.828 [2024-07-15 13:37:13.336779] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:25.828 [2024-07-15 13:37:13.336794] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x162ee80 00:14:25.828 [2024-07-15 13:37:13.336803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:25.828 [2024-07-15 13:37:13.337073] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:25.828 [2024-07-15 13:37:13.337085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:25.828 [2024-07-15 13:37:13.337132] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:25.828 [2024-07-15 13:37:13.337146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:25.828 pt2 00:14:25.828 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:25.828 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:25.828 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:26.086 [2024-07-15 13:37:13.509183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:26.086 [2024-07-15 13:37:13.509215] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:26.086 [2024-07-15 13:37:13.509243] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17dd840 00:14:26.086 [2024-07-15 13:37:13.509251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:26.086 [2024-07-15 13:37:13.509464] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:26.086 [2024-07-15 13:37:13.509475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:26.086 [2024-07-15 13:37:13.509515] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:26.086 [2024-07-15 13:37:13.509527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:26.086 [2024-07-15 13:37:13.509607] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17da240 00:14:26.086 [2024-07-15 13:37:13.509614] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:26.086 [2024-07-15 
13:37:13.509732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17dc520 00:14:26.086 [2024-07-15 13:37:13.509821] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17da240 00:14:26.086 [2024-07-15 13:37:13.509828] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17da240 00:14:26.086 [2024-07-15 13:37:13.509892] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:26.086 pt3 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:26.086 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.344 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.344 "name": "raid_bdev1", 00:14:26.344 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:26.344 "strip_size_kb": 0, 00:14:26.344 "state": "online", 00:14:26.344 "raid_level": "raid1", 00:14:26.344 "superblock": true, 00:14:26.344 "num_base_bdevs": 3, 00:14:26.344 "num_base_bdevs_discovered": 3, 00:14:26.344 "num_base_bdevs_operational": 3, 00:14:26.344 "base_bdevs_list": [ 00:14:26.344 { 00:14:26.344 "name": "pt1", 00:14:26.344 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:26.344 "is_configured": true, 00:14:26.344 "data_offset": 2048, 00:14:26.344 "data_size": 63488 00:14:26.344 }, 00:14:26.344 { 00:14:26.344 "name": "pt2", 00:14:26.344 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:26.344 "is_configured": true, 00:14:26.344 "data_offset": 2048, 00:14:26.344 "data_size": 63488 00:14:26.344 }, 00:14:26.344 { 00:14:26.344 "name": "pt3", 00:14:26.344 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:26.344 "is_configured": true, 00:14:26.344 "data_offset": 2048, 00:14:26.344 "data_size": 63488 00:14:26.344 } 00:14:26.344 ] 00:14:26.344 }' 00:14:26.344 13:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.344 13:37:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.602 13:37:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:26.602 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:26.860 [2024-07-15 13:37:14.315440] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:26.860 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:26.860 "name": "raid_bdev1", 00:14:26.860 "aliases": [ 00:14:26.860 "e3ac7e95-03fd-4953-99bb-e0adaca8669c" 00:14:26.860 ], 00:14:26.860 "product_name": "Raid Volume", 00:14:26.860 "block_size": 512, 00:14:26.860 "num_blocks": 63488, 00:14:26.860 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:26.860 "assigned_rate_limits": { 00:14:26.860 "rw_ios_per_sec": 0, 00:14:26.860 "rw_mbytes_per_sec": 0, 00:14:26.860 "r_mbytes_per_sec": 0, 00:14:26.860 "w_mbytes_per_sec": 0 00:14:26.860 }, 00:14:26.860 "claimed": false, 00:14:26.860 "zoned": false, 00:14:26.860 "supported_io_types": { 00:14:26.860 "read": true, 00:14:26.860 "write": true, 00:14:26.860 "unmap": false, 00:14:26.860 "flush": false, 00:14:26.860 "reset": true, 00:14:26.860 "nvme_admin": false, 00:14:26.860 "nvme_io": false, 00:14:26.860 "nvme_io_md": false, 00:14:26.860 "write_zeroes": true, 00:14:26.860 "zcopy": false, 00:14:26.860 "get_zone_info": false, 00:14:26.860 "zone_management": false, 00:14:26.860 "zone_append": false, 00:14:26.860 "compare": false, 00:14:26.860 "compare_and_write": false, 00:14:26.860 "abort": false, 00:14:26.860 "seek_hole": false, 00:14:26.860 "seek_data": false, 00:14:26.860 "copy": false, 00:14:26.860 "nvme_iov_md": false 00:14:26.860 }, 00:14:26.860 "memory_domains": [ 00:14:26.860 { 00:14:26.860 "dma_device_id": "system", 00:14:26.860 "dma_device_type": 1 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.860 "dma_device_type": 2 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "dma_device_id": "system", 00:14:26.860 "dma_device_type": 1 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.860 "dma_device_type": 2 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "dma_device_id": "system", 00:14:26.860 "dma_device_type": 1 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.860 "dma_device_type": 2 00:14:26.860 } 00:14:26.860 ], 00:14:26.860 "driver_specific": { 00:14:26.860 "raid": { 00:14:26.860 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:26.860 "strip_size_kb": 0, 00:14:26.860 "state": "online", 00:14:26.860 "raid_level": "raid1", 00:14:26.860 "superblock": true, 00:14:26.860 "num_base_bdevs": 3, 00:14:26.860 "num_base_bdevs_discovered": 3, 00:14:26.860 "num_base_bdevs_operational": 3, 00:14:26.860 "base_bdevs_list": [ 00:14:26.860 { 00:14:26.860 
"name": "pt1", 00:14:26.860 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:26.860 "is_configured": true, 00:14:26.860 "data_offset": 2048, 00:14:26.860 "data_size": 63488 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "name": "pt2", 00:14:26.860 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:26.860 "is_configured": true, 00:14:26.860 "data_offset": 2048, 00:14:26.860 "data_size": 63488 00:14:26.860 }, 00:14:26.860 { 00:14:26.860 "name": "pt3", 00:14:26.860 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:26.860 "is_configured": true, 00:14:26.860 "data_offset": 2048, 00:14:26.860 "data_size": 63488 00:14:26.860 } 00:14:26.860 ] 00:14:26.860 } 00:14:26.860 } 00:14:26.860 }' 00:14:26.860 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:26.860 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:26.860 pt2 00:14:26.860 pt3' 00:14:26.860 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:26.860 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:26.860 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.118 "name": "pt1", 00:14:27.118 "aliases": [ 00:14:27.118 "00000000-0000-0000-0000-000000000001" 00:14:27.118 ], 00:14:27.118 "product_name": "passthru", 00:14:27.118 "block_size": 512, 00:14:27.118 "num_blocks": 65536, 00:14:27.118 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:27.118 "assigned_rate_limits": { 00:14:27.118 "rw_ios_per_sec": 0, 00:14:27.118 "rw_mbytes_per_sec": 0, 00:14:27.118 "r_mbytes_per_sec": 0, 00:14:27.118 "w_mbytes_per_sec": 0 00:14:27.118 }, 00:14:27.118 "claimed": true, 00:14:27.118 "claim_type": "exclusive_write", 00:14:27.118 "zoned": false, 00:14:27.118 "supported_io_types": { 00:14:27.118 "read": true, 00:14:27.118 "write": true, 00:14:27.118 "unmap": true, 00:14:27.118 "flush": true, 00:14:27.118 "reset": true, 00:14:27.118 "nvme_admin": false, 00:14:27.118 "nvme_io": false, 00:14:27.118 "nvme_io_md": false, 00:14:27.118 "write_zeroes": true, 00:14:27.118 "zcopy": true, 00:14:27.118 "get_zone_info": false, 00:14:27.118 "zone_management": false, 00:14:27.118 "zone_append": false, 00:14:27.118 "compare": false, 00:14:27.118 "compare_and_write": false, 00:14:27.118 "abort": true, 00:14:27.118 "seek_hole": false, 00:14:27.118 "seek_data": false, 00:14:27.118 "copy": true, 00:14:27.118 "nvme_iov_md": false 00:14:27.118 }, 00:14:27.118 "memory_domains": [ 00:14:27.118 { 00:14:27.118 "dma_device_id": "system", 00:14:27.118 "dma_device_type": 1 00:14:27.118 }, 00:14:27.118 { 00:14:27.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.118 "dma_device_type": 2 00:14:27.118 } 00:14:27.118 ], 00:14:27.118 "driver_specific": { 00:14:27.118 "passthru": { 00:14:27.118 "name": "pt1", 00:14:27.118 "base_bdev_name": "malloc1" 00:14:27.118 } 00:14:27.118 } 00:14:27.118 }' 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.118 
13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.118 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:27.375 13:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.631 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.631 "name": "pt2", 00:14:27.631 "aliases": [ 00:14:27.631 "00000000-0000-0000-0000-000000000002" 00:14:27.631 ], 00:14:27.631 "product_name": "passthru", 00:14:27.631 "block_size": 512, 00:14:27.632 "num_blocks": 65536, 00:14:27.632 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:27.632 "assigned_rate_limits": { 00:14:27.632 "rw_ios_per_sec": 0, 00:14:27.632 "rw_mbytes_per_sec": 0, 00:14:27.632 "r_mbytes_per_sec": 0, 00:14:27.632 "w_mbytes_per_sec": 0 00:14:27.632 }, 00:14:27.632 "claimed": true, 00:14:27.632 "claim_type": "exclusive_write", 00:14:27.632 "zoned": false, 00:14:27.632 "supported_io_types": { 00:14:27.632 "read": true, 00:14:27.632 "write": true, 00:14:27.632 "unmap": true, 00:14:27.632 "flush": true, 00:14:27.632 "reset": true, 00:14:27.632 "nvme_admin": false, 00:14:27.632 "nvme_io": false, 00:14:27.632 "nvme_io_md": false, 00:14:27.632 "write_zeroes": true, 00:14:27.632 "zcopy": true, 00:14:27.632 "get_zone_info": false, 00:14:27.632 "zone_management": false, 00:14:27.632 "zone_append": false, 00:14:27.632 "compare": false, 00:14:27.632 "compare_and_write": false, 00:14:27.632 "abort": true, 00:14:27.632 "seek_hole": false, 00:14:27.632 "seek_data": false, 00:14:27.632 "copy": true, 00:14:27.632 "nvme_iov_md": false 00:14:27.632 }, 00:14:27.632 "memory_domains": [ 00:14:27.632 { 00:14:27.632 "dma_device_id": "system", 00:14:27.632 "dma_device_type": 1 00:14:27.632 }, 00:14:27.632 { 00:14:27.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.632 "dma_device_type": 2 00:14:27.632 } 00:14:27.632 ], 00:14:27.632 "driver_specific": { 00:14:27.632 "passthru": { 00:14:27.632 "name": "pt2", 00:14:27.632 "base_bdev_name": "malloc2" 00:14:27.632 } 00:14:27.632 } 00:14:27.632 }' 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.632 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:27.889 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:28.146 "name": "pt3", 00:14:28.146 "aliases": [ 00:14:28.146 "00000000-0000-0000-0000-000000000003" 00:14:28.146 ], 00:14:28.146 "product_name": "passthru", 00:14:28.146 "block_size": 512, 00:14:28.146 "num_blocks": 65536, 00:14:28.146 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:28.146 "assigned_rate_limits": { 00:14:28.146 "rw_ios_per_sec": 0, 00:14:28.146 "rw_mbytes_per_sec": 0, 00:14:28.146 "r_mbytes_per_sec": 0, 00:14:28.146 "w_mbytes_per_sec": 0 00:14:28.146 }, 00:14:28.146 "claimed": true, 00:14:28.146 "claim_type": "exclusive_write", 00:14:28.146 "zoned": false, 00:14:28.146 "supported_io_types": { 00:14:28.146 "read": true, 00:14:28.146 "write": true, 00:14:28.146 "unmap": true, 00:14:28.146 "flush": true, 00:14:28.146 "reset": true, 00:14:28.146 "nvme_admin": false, 00:14:28.146 "nvme_io": false, 00:14:28.146 "nvme_io_md": false, 00:14:28.146 "write_zeroes": true, 00:14:28.146 "zcopy": true, 00:14:28.146 "get_zone_info": false, 00:14:28.146 "zone_management": false, 00:14:28.146 "zone_append": false, 00:14:28.146 "compare": false, 00:14:28.146 "compare_and_write": false, 00:14:28.146 "abort": true, 00:14:28.146 "seek_hole": false, 00:14:28.146 "seek_data": false, 00:14:28.146 "copy": true, 00:14:28.146 "nvme_iov_md": false 00:14:28.146 }, 00:14:28.146 "memory_domains": [ 00:14:28.146 { 00:14:28.146 "dma_device_id": "system", 00:14:28.146 "dma_device_type": 1 00:14:28.146 }, 00:14:28.146 { 00:14:28.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.146 "dma_device_type": 2 00:14:28.146 } 00:14:28.146 ], 00:14:28.146 "driver_specific": { 00:14:28.146 "passthru": { 00:14:28.146 "name": "pt3", 00:14:28.146 "base_bdev_name": "malloc3" 00:14:28.146 } 00:14:28.146 } 00:14:28.146 }' 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.146 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.403 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.403 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.403 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:28.403 13:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:28.403 [2024-07-15 13:37:15.995799] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:28.404 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' e3ac7e95-03fd-4953-99bb-e0adaca8669c '!=' e3ac7e95-03fd-4953-99bb-e0adaca8669c ']' 00:14:28.404 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:14:28.404 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:28.404 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:28.404 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:28.660 [2024-07-15 13:37:16.168083] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.660 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:28.917 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.917 "name": "raid_bdev1", 00:14:28.917 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:28.917 "strip_size_kb": 0, 00:14:28.917 "state": "online", 00:14:28.917 "raid_level": "raid1", 00:14:28.917 "superblock": true, 
00:14:28.917 "num_base_bdevs": 3, 00:14:28.917 "num_base_bdevs_discovered": 2, 00:14:28.917 "num_base_bdevs_operational": 2, 00:14:28.917 "base_bdevs_list": [ 00:14:28.917 { 00:14:28.917 "name": null, 00:14:28.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.917 "is_configured": false, 00:14:28.917 "data_offset": 2048, 00:14:28.917 "data_size": 63488 00:14:28.917 }, 00:14:28.917 { 00:14:28.917 "name": "pt2", 00:14:28.917 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:28.917 "is_configured": true, 00:14:28.917 "data_offset": 2048, 00:14:28.917 "data_size": 63488 00:14:28.917 }, 00:14:28.917 { 00:14:28.917 "name": "pt3", 00:14:28.917 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:28.917 "is_configured": true, 00:14:28.917 "data_offset": 2048, 00:14:28.917 "data_size": 63488 00:14:28.917 } 00:14:28.917 ] 00:14:28.917 }' 00:14:28.917 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.917 13:37:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.481 13:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:29.481 [2024-07-15 13:37:17.018255] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:29.481 [2024-07-15 13:37:17.018278] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:29.482 [2024-07-15 13:37:17.018318] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:29.482 [2024-07-15 13:37:17.018360] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:29.482 [2024-07-15 13:37:17.018367] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17da240 name raid_bdev1, state offline 00:14:29.482 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.482 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:14:29.739 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:14:29.739 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:14:29.739 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:14:29.739 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:29.739 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:29.997 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:30.255 [2024-07-15 13:37:17.712027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:30.255 [2024-07-15 13:37:17.712067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.255 [2024-07-15 13:37:17.712097] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17dda70 00:14:30.255 [2024-07-15 13:37:17.712106] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.255 [2024-07-15 13:37:17.713286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.255 [2024-07-15 13:37:17.713308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:30.255 [2024-07-15 13:37:17.713356] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:30.255 [2024-07-15 13:37:17.713377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:30.255 pt2 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.255 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.513 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.513 "name": "raid_bdev1", 00:14:30.513 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:30.513 "strip_size_kb": 0, 00:14:30.513 "state": "configuring", 00:14:30.513 "raid_level": "raid1", 00:14:30.513 "superblock": true, 00:14:30.513 "num_base_bdevs": 3, 00:14:30.513 "num_base_bdevs_discovered": 1, 00:14:30.513 "num_base_bdevs_operational": 2, 00:14:30.513 "base_bdevs_list": [ 00:14:30.513 { 00:14:30.513 "name": null, 00:14:30.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.513 "is_configured": false, 00:14:30.513 "data_offset": 2048, 00:14:30.513 "data_size": 63488 00:14:30.513 }, 00:14:30.513 { 00:14:30.513 "name": "pt2", 00:14:30.513 "uuid": "00000000-0000-0000-0000-000000000002", 
00:14:30.513 "is_configured": true, 00:14:30.513 "data_offset": 2048, 00:14:30.513 "data_size": 63488 00:14:30.513 }, 00:14:30.513 { 00:14:30.513 "name": null, 00:14:30.513 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:30.513 "is_configured": false, 00:14:30.513 "data_offset": 2048, 00:14:30.513 "data_size": 63488 00:14:30.513 } 00:14:30.513 ] 00:14:30.513 }' 00:14:30.513 13:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.513 13:37:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:31.080 [2024-07-15 13:37:18.566233] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:31.080 [2024-07-15 13:37:18.566275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.080 [2024-07-15 13:37:18.566289] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17df940 00:14:31.080 [2024-07-15 13:37:18.566298] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.080 [2024-07-15 13:37:18.566542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.080 [2024-07-15 13:37:18.566553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:31.080 [2024-07-15 13:37:18.566598] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:31.080 [2024-07-15 13:37:18.566612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:31.080 [2024-07-15 13:37:18.566681] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17da950 00:14:31.080 [2024-07-15 13:37:18.566688] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:31.080 [2024-07-15 13:37:18.566796] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d9ce0 00:14:31.080 [2024-07-15 13:37:18.566878] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17da950 00:14:31.080 [2024-07-15 13:37:18.566884] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17da950 00:14:31.080 [2024-07-15 13:37:18.566950] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.080 pt3 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:31.080 13:37:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.080 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:31.339 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.339 "name": "raid_bdev1", 00:14:31.339 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:31.339 "strip_size_kb": 0, 00:14:31.339 "state": "online", 00:14:31.339 "raid_level": "raid1", 00:14:31.339 "superblock": true, 00:14:31.339 "num_base_bdevs": 3, 00:14:31.339 "num_base_bdevs_discovered": 2, 00:14:31.339 "num_base_bdevs_operational": 2, 00:14:31.339 "base_bdevs_list": [ 00:14:31.339 { 00:14:31.339 "name": null, 00:14:31.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.339 "is_configured": false, 00:14:31.339 "data_offset": 2048, 00:14:31.339 "data_size": 63488 00:14:31.339 }, 00:14:31.339 { 00:14:31.339 "name": "pt2", 00:14:31.339 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:31.339 "is_configured": true, 00:14:31.339 "data_offset": 2048, 00:14:31.339 "data_size": 63488 00:14:31.339 }, 00:14:31.339 { 00:14:31.339 "name": "pt3", 00:14:31.339 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:31.339 "is_configured": true, 00:14:31.339 "data_offset": 2048, 00:14:31.339 "data_size": 63488 00:14:31.339 } 00:14:31.339 ] 00:14:31.339 }' 00:14:31.339 13:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.339 13:37:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.905 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:31.905 [2024-07-15 13:37:19.412389] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:31.905 [2024-07-15 13:37:19.412411] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:31.905 [2024-07-15 13:37:19.412451] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:31.905 [2024-07-15 13:37:19.412491] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:31.905 [2024-07-15 13:37:19.412500] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17da950 name raid_bdev1, state offline 00:14:31.905 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.905 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:14:32.163 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:14:32.163 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:14:32.163 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
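The stretch of trace above tears raid_bdev1 down, rebuilds it one passthru base bdev at a time, and re-reads the reported state after each step. A minimal sketch of that delete/recreate/verify cycle, using only rpc.py calls that appear verbatim in the trace (the $rpc alias is illustrative shorthand, not part of the test scripts), might look like:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Drop the raid bdev and confirm the target no longer reports any raid bdevs.
  $rpc bdev_raid_delete raid_bdev1
  $rpc bdev_raid_get_bdevs all | jq -r '.[]'        # expected: empty output

  # Remove both remaining passthru base bdevs, then re-create pt2 on top of malloc2.
  $rpc bdev_passthru_delete pt2
  $rpc bdev_passthru_delete pt3
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # The superblock found on pt2 re-registers raid_bdev1; with one of three base
  # bdevs discovered it should still report the "configuring" state.
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'

Re-creating pt3 the same way is what brought the array back to "online" with two of the three base bdevs discovered in the dump above; the trace then deletes the array again and repeats the exercise, this time rebuilding from pt1.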
00:14:32.163 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:14:32.163 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:32.421 [2024-07-15 13:37:19.941731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:32.421 [2024-07-15 13:37:19.941767] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:32.421 [2024-07-15 13:37:19.941794] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17df500 00:14:32.421 [2024-07-15 13:37:19.941803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:32.421 [2024-07-15 13:37:19.942949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:32.421 [2024-07-15 13:37:19.942971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:32.421 [2024-07-15 13:37:19.943025] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:32.421 [2024-07-15 13:37:19.943045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:32.421 [2024-07-15 13:37:19.943113] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:14:32.421 [2024-07-15 13:37:19.943122] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:32.421 [2024-07-15 13:37:19.943132] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162d800 name raid_bdev1, state configuring 00:14:32.421 [2024-07-15 13:37:19.943147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:32.421 pt1 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.421 13:37:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:32.680 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.680 "name": "raid_bdev1", 00:14:32.680 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:32.680 "strip_size_kb": 0, 00:14:32.680 "state": "configuring", 00:14:32.680 "raid_level": "raid1", 00:14:32.680 "superblock": true, 00:14:32.680 "num_base_bdevs": 3, 00:14:32.680 "num_base_bdevs_discovered": 1, 00:14:32.680 "num_base_bdevs_operational": 2, 00:14:32.680 "base_bdevs_list": [ 00:14:32.680 { 00:14:32.680 "name": null, 00:14:32.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.680 "is_configured": false, 00:14:32.680 "data_offset": 2048, 00:14:32.680 "data_size": 63488 00:14:32.680 }, 00:14:32.680 { 00:14:32.680 "name": "pt2", 00:14:32.680 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:32.680 "is_configured": true, 00:14:32.680 "data_offset": 2048, 00:14:32.680 "data_size": 63488 00:14:32.680 }, 00:14:32.680 { 00:14:32.680 "name": null, 00:14:32.680 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:32.680 "is_configured": false, 00:14:32.680 "data_offset": 2048, 00:14:32.680 "data_size": 63488 00:14:32.680 } 00:14:32.680 ] 00:14:32.680 }' 00:14:32.680 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.680 13:37:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.247 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:33.247 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:14:33.247 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:14:33.247 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:33.506 [2024-07-15 13:37:20.960366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:33.506 [2024-07-15 13:37:20.960407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:33.506 [2024-07-15 13:37:20.960436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17df940 00:14:33.506 [2024-07-15 13:37:20.960445] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:33.506 [2024-07-15 13:37:20.960702] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:33.506 [2024-07-15 13:37:20.960714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:33.506 [2024-07-15 13:37:20.960763] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:33.506 [2024-07-15 13:37:20.960777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:33.506 [2024-07-15 13:37:20.960850] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x162e630 00:14:33.506 [2024-07-15 13:37:20.960857] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:33.506 [2024-07-15 13:37:20.960977] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16bf950 00:14:33.506 [2024-07-15 13:37:20.961083] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x162e630 00:14:33.506 [2024-07-15 13:37:20.961090] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x162e630 00:14:33.506 [2024-07-15 13:37:20.961158] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:33.506 pt3 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.506 13:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:33.766 13:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.766 "name": "raid_bdev1", 00:14:33.766 "uuid": "e3ac7e95-03fd-4953-99bb-e0adaca8669c", 00:14:33.766 "strip_size_kb": 0, 00:14:33.766 "state": "online", 00:14:33.766 "raid_level": "raid1", 00:14:33.766 "superblock": true, 00:14:33.766 "num_base_bdevs": 3, 00:14:33.766 "num_base_bdevs_discovered": 2, 00:14:33.766 "num_base_bdevs_operational": 2, 00:14:33.766 "base_bdevs_list": [ 00:14:33.766 { 00:14:33.766 "name": null, 00:14:33.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.766 "is_configured": false, 00:14:33.766 "data_offset": 2048, 00:14:33.766 "data_size": 63488 00:14:33.766 }, 00:14:33.766 { 00:14:33.766 "name": "pt2", 00:14:33.766 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:33.766 "is_configured": true, 00:14:33.766 "data_offset": 2048, 00:14:33.766 "data_size": 63488 00:14:33.766 }, 00:14:33.766 { 00:14:33.766 "name": "pt3", 00:14:33.766 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:33.766 "is_configured": true, 00:14:33.766 "data_offset": 2048, 00:14:33.766 "data_size": 63488 00:14:33.766 } 00:14:33.766 ] 00:14:33.766 }' 00:14:33.766 13:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.766 13:37:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.333 13:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:14:34.333 13:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:34.333 13:37:21 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:14:34.333 13:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:34.333 13:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:14:34.592 [2024-07-15 13:37:21.995209] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' e3ac7e95-03fd-4953-99bb-e0adaca8669c '!=' e3ac7e95-03fd-4953-99bb-e0adaca8669c ']' 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 16931 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 16931 ']' 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 16931 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 16931 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 16931' 00:14:34.592 killing process with pid 16931 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 16931 00:14:34.592 [2024-07-15 13:37:22.057111] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:34.592 [2024-07-15 13:37:22.057150] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:34.592 [2024-07-15 13:37:22.057189] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:34.592 [2024-07-15 13:37:22.057197] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162e630 name raid_bdev1, state offline 00:14:34.592 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 16931 00:14:34.592 [2024-07-15 13:37:22.081851] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:34.850 13:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:34.850 00:14:34.850 real 0m16.958s 00:14:34.850 user 0m30.736s 00:14:34.850 sys 0m3.341s 00:14:34.850 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:34.850 13:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.850 ************************************ 00:14:34.850 END TEST raid_superblock_test 00:14:34.850 ************************************ 00:14:34.850 13:37:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:34.850 13:37:22 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:14:34.850 13:37:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:34.850 13:37:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:34.850 13:37:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:34.850 
************************************ 00:14:34.850 START TEST raid_read_error_test 00:14:34.850 ************************************ 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:34.850 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3tfUf72Mfg 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=19677 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 19677 /var/tmp/spdk-raid.sock 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 19677 ']' 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:34.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:34.851 13:37:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.851 [2024-07-15 13:37:22.405486] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:14:34.851 [2024-07-15 13:37:22.405534] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid19677 ] 00:14:35.109 [2024-07-15 13:37:22.493247] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:35.109 [2024-07-15 13:37:22.584079] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:35.109 [2024-07-15 13:37:22.646987] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:35.109 [2024-07-15 13:37:22.647020] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:35.674 13:37:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:35.674 13:37:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:35.674 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:35.674 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:35.930 BaseBdev1_malloc 00:14:35.930 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:36.187 true 00:14:36.187 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:36.187 [2024-07-15 13:37:23.721786] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:36.187 [2024-07-15 13:37:23.721822] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:36.187 [2024-07-15 13:37:23.721837] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1302990 00:14:36.187 [2024-07-15 13:37:23.721846] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:36.187 [2024-07-15 13:37:23.723283] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:36.187 [2024-07-15 13:37:23.723305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:36.187 BaseBdev1 00:14:36.187 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:36.187 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:36.445 BaseBdev2_malloc 00:14:36.445 13:37:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:36.704 true 00:14:36.704 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:36.704 [2024-07-15 13:37:24.259022] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:36.704 [2024-07-15 13:37:24.259056] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:36.704 [2024-07-15 13:37:24.259072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13071d0 00:14:36.704 [2024-07-15 13:37:24.259081] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:36.704 [2024-07-15 13:37:24.260279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:36.704 [2024-07-15 13:37:24.260302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:36.704 BaseBdev2 00:14:36.704 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:36.704 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:36.962 BaseBdev3_malloc 00:14:36.962 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:37.221 true 00:14:37.221 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:37.221 [2024-07-15 13:37:24.781257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:37.221 [2024-07-15 13:37:24.781294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:37.221 [2024-07-15 13:37:24.781309] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1309490 00:14:37.221 [2024-07-15 13:37:24.781318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:37.221 [2024-07-15 13:37:24.782492] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:37.221 [2024-07-15 13:37:24.782515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:37.221 BaseBdev3 00:14:37.221 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:37.478 [2024-07-15 13:37:24.945708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:37.478 [2024-07-15 13:37:24.946746] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.478 [2024-07-15 13:37:24.946797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:37.478 [2024-07-15 
13:37:24.946959] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x130ab40 00:14:37.478 [2024-07-15 13:37:24.946968] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:37.478 [2024-07-15 13:37:24.947133] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x130a6e0 00:14:37.478 [2024-07-15 13:37:24.947249] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x130ab40 00:14:37.478 [2024-07-15 13:37:24.947256] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x130ab40 00:14:37.478 [2024-07-15 13:37:24.947332] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.478 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.479 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.479 13:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:37.737 13:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.737 "name": "raid_bdev1", 00:14:37.737 "uuid": "e9e745aa-77f8-4ae2-813e-513b6b271439", 00:14:37.737 "strip_size_kb": 0, 00:14:37.737 "state": "online", 00:14:37.737 "raid_level": "raid1", 00:14:37.737 "superblock": true, 00:14:37.737 "num_base_bdevs": 3, 00:14:37.737 "num_base_bdevs_discovered": 3, 00:14:37.737 "num_base_bdevs_operational": 3, 00:14:37.737 "base_bdevs_list": [ 00:14:37.737 { 00:14:37.737 "name": "BaseBdev1", 00:14:37.737 "uuid": "1f1221a8-b073-5384-97f5-298691474128", 00:14:37.737 "is_configured": true, 00:14:37.737 "data_offset": 2048, 00:14:37.737 "data_size": 63488 00:14:37.737 }, 00:14:37.737 { 00:14:37.737 "name": "BaseBdev2", 00:14:37.737 "uuid": "ee525d1b-763e-538b-8b6c-468ca32340a5", 00:14:37.737 "is_configured": true, 00:14:37.737 "data_offset": 2048, 00:14:37.737 "data_size": 63488 00:14:37.737 }, 00:14:37.737 { 00:14:37.737 "name": "BaseBdev3", 00:14:37.737 "uuid": "2c74d656-d1ca-5980-a32f-35b626412951", 00:14:37.737 "is_configured": true, 00:14:37.737 "data_offset": 2048, 00:14:37.737 "data_size": 63488 00:14:37.737 } 00:14:37.737 ] 00:14:37.737 }' 00:14:37.737 13:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.737 13:37:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set 
+x 00:14:38.303 13:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:38.303 13:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:38.303 [2024-07-15 13:37:25.727937] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11586c0 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.238 13:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:39.496 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.496 "name": "raid_bdev1", 00:14:39.496 "uuid": "e9e745aa-77f8-4ae2-813e-513b6b271439", 00:14:39.496 "strip_size_kb": 0, 00:14:39.496 "state": "online", 00:14:39.496 "raid_level": "raid1", 00:14:39.496 "superblock": true, 00:14:39.496 "num_base_bdevs": 3, 00:14:39.496 "num_base_bdevs_discovered": 3, 00:14:39.496 "num_base_bdevs_operational": 3, 00:14:39.496 "base_bdevs_list": [ 00:14:39.496 { 00:14:39.496 "name": "BaseBdev1", 00:14:39.496 "uuid": "1f1221a8-b073-5384-97f5-298691474128", 00:14:39.496 "is_configured": true, 00:14:39.496 "data_offset": 2048, 00:14:39.496 "data_size": 63488 00:14:39.496 }, 00:14:39.496 { 00:14:39.496 "name": "BaseBdev2", 00:14:39.496 "uuid": "ee525d1b-763e-538b-8b6c-468ca32340a5", 00:14:39.496 "is_configured": true, 00:14:39.496 "data_offset": 2048, 00:14:39.496 "data_size": 63488 00:14:39.496 }, 00:14:39.496 { 00:14:39.496 "name": "BaseBdev3", 00:14:39.496 "uuid": 
"2c74d656-d1ca-5980-a32f-35b626412951", 00:14:39.496 "is_configured": true, 00:14:39.496 "data_offset": 2048, 00:14:39.496 "data_size": 63488 00:14:39.496 } 00:14:39.496 ] 00:14:39.496 }' 00:14:39.496 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.496 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.065 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:40.065 [2024-07-15 13:37:27.655159] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:40.065 [2024-07-15 13:37:27.655199] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:40.065 [2024-07-15 13:37:27.657245] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:40.065 [2024-07-15 13:37:27.657272] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:40.065 [2024-07-15 13:37:27.657335] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:40.065 [2024-07-15 13:37:27.657343] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x130ab40 name raid_bdev1, state offline 00:14:40.065 0 00:14:40.065 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 19677 00:14:40.065 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 19677 ']' 00:14:40.065 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 19677 00:14:40.065 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:40.065 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:40.321 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 19677 00:14:40.321 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:40.321 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:40.321 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 19677' 00:14:40.321 killing process with pid 19677 00:14:40.321 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 19677 00:14:40.321 [2024-07-15 13:37:27.723179] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:40.321 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 19677 00:14:40.321 [2024-07-15 13:37:27.742580] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3tfUf72Mfg 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 
00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:40.579 00:14:40.579 real 0m5.615s 00:14:40.579 user 0m8.559s 00:14:40.579 sys 0m1.026s 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:40.579 13:37:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.579 ************************************ 00:14:40.579 END TEST raid_read_error_test 00:14:40.579 ************************************ 00:14:40.579 13:37:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:40.579 13:37:27 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:14:40.579 13:37:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:40.579 13:37:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:40.579 13:37:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:40.579 ************************************ 00:14:40.579 START TEST raid_write_error_test 00:14:40.579 ************************************ 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' 
raid1 '!=' raid1 ']' 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cj686GcKnX 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=20483 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 20483 /var/tmp/spdk-raid.sock 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 20483 ']' 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:40.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:40.579 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.579 [2024-07-15 13:37:28.100385] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:14:40.579 [2024-07-15 13:37:28.100434] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20483 ] 00:14:40.579 [2024-07-15 13:37:28.186978] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:40.837 [2024-07-15 13:37:28.275100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:40.837 [2024-07-15 13:37:28.328827] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:40.837 [2024-07-15 13:37:28.328857] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:41.402 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:41.402 13:37:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:41.402 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:41.402 13:37:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:41.660 BaseBdev1_malloc 00:14:41.660 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:41.660 true 00:14:41.660 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:41.918 [2024-07-15 13:37:29.399814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:41.918 [2024-07-15 13:37:29.399853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:41.918 [2024-07-15 13:37:29.399886] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111e990 00:14:41.918 [2024-07-15 13:37:29.399894] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:41.918 [2024-07-15 13:37:29.401294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:41.918 [2024-07-15 13:37:29.401317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:41.918 BaseBdev1 00:14:41.918 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:41.918 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:42.191 BaseBdev2_malloc 00:14:42.191 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:42.191 true 00:14:42.191 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:42.505 [2024-07-15 13:37:29.929203] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:42.505 [2024-07-15 13:37:29.929238] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:42.505 [2024-07-15 13:37:29.929256] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11231d0 00:14:42.505 [2024-07-15 13:37:29.929264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:42.505 [2024-07-15 13:37:29.930426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:42.506 [2024-07-15 13:37:29.930449] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:42.506 BaseBdev2 00:14:42.506 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:42.506 13:37:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:42.506 BaseBdev3_malloc 00:14:42.506 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:42.764 true 00:14:42.764 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:43.022 [2024-07-15 13:37:30.463498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:43.022 [2024-07-15 13:37:30.463535] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:43.022 [2024-07-15 13:37:30.463551] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1125490 00:14:43.022 [2024-07-15 13:37:30.463560] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:43.022 [2024-07-15 13:37:30.464781] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:43.022 [2024-07-15 13:37:30.464804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:43.022 BaseBdev3 00:14:43.022 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:43.280 [2024-07-15 13:37:30.644006] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:43.280 [2024-07-15 13:37:30.645021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:43.280 [2024-07-15 13:37:30.645071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:43.280 [2024-07-15 13:37:30.645225] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1126b40 00:14:43.280 [2024-07-15 13:37:30.645236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:43.280 [2024-07-15 13:37:30.645379] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11266e0 00:14:43.280 [2024-07-15 13:37:30.645486] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1126b40 00:14:43.280 [2024-07-15 13:37:30.645493] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1126b40 00:14:43.280 [2024-07-15 13:37:30.645565] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:43.280 
13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:43.280 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.280 "name": "raid_bdev1", 00:14:43.280 "uuid": "edd86d7b-6113-423c-8379-e001b2e2035e", 00:14:43.280 "strip_size_kb": 0, 00:14:43.280 "state": "online", 00:14:43.280 "raid_level": "raid1", 00:14:43.280 "superblock": true, 00:14:43.281 "num_base_bdevs": 3, 00:14:43.281 "num_base_bdevs_discovered": 3, 00:14:43.281 "num_base_bdevs_operational": 3, 00:14:43.281 "base_bdevs_list": [ 00:14:43.281 { 00:14:43.281 "name": "BaseBdev1", 00:14:43.281 "uuid": "7d6f1ce7-2633-501e-a9c1-04446a878eca", 00:14:43.281 "is_configured": true, 00:14:43.281 "data_offset": 2048, 00:14:43.281 "data_size": 63488 00:14:43.281 }, 00:14:43.281 { 00:14:43.281 "name": "BaseBdev2", 00:14:43.281 "uuid": "d35eb94f-6b82-55cb-b7ce-75a986afd7cb", 00:14:43.281 "is_configured": true, 00:14:43.281 "data_offset": 2048, 00:14:43.281 "data_size": 63488 00:14:43.281 }, 00:14:43.281 { 00:14:43.281 "name": "BaseBdev3", 00:14:43.281 "uuid": "58f42923-edcd-5bb8-ac97-e512e9f3e3fc", 00:14:43.281 "is_configured": true, 00:14:43.281 "data_offset": 2048, 00:14:43.281 "data_size": 63488 00:14:43.281 } 00:14:43.281 ] 00:14:43.281 }' 00:14:43.281 13:37:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.281 13:37:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.846 13:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:43.846 13:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:43.846 [2024-07-15 13:37:31.346034] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf746c0 00:14:44.782 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:45.038 [2024-07-15 13:37:32.439076] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:45.038 [2024-07-15 13:37:32.439121] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:45.038 [2024-07-15 13:37:32.439292] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf746c0 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:45.038 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.296 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.296 "name": "raid_bdev1", 00:14:45.296 "uuid": "edd86d7b-6113-423c-8379-e001b2e2035e", 00:14:45.296 "strip_size_kb": 0, 00:14:45.296 "state": "online", 00:14:45.296 "raid_level": "raid1", 00:14:45.296 "superblock": true, 00:14:45.296 "num_base_bdevs": 3, 00:14:45.296 "num_base_bdevs_discovered": 2, 00:14:45.296 "num_base_bdevs_operational": 2, 00:14:45.296 "base_bdevs_list": [ 00:14:45.296 { 00:14:45.296 "name": null, 00:14:45.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.296 "is_configured": false, 00:14:45.296 "data_offset": 2048, 00:14:45.296 "data_size": 63488 00:14:45.296 }, 00:14:45.296 { 00:14:45.296 "name": "BaseBdev2", 00:14:45.296 "uuid": "d35eb94f-6b82-55cb-b7ce-75a986afd7cb", 00:14:45.296 "is_configured": true, 00:14:45.296 "data_offset": 2048, 00:14:45.296 "data_size": 63488 00:14:45.296 }, 00:14:45.296 { 00:14:45.296 "name": "BaseBdev3", 00:14:45.296 "uuid": "58f42923-edcd-5bb8-ac97-e512e9f3e3fc", 00:14:45.296 "is_configured": true, 00:14:45.296 "data_offset": 2048, 00:14:45.296 "data_size": 63488 00:14:45.296 } 00:14:45.296 ] 00:14:45.296 }' 00:14:45.296 13:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.296 
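Here bdev_error_inject_error arms EE_BaseBdev1_malloc to fail write I/O; once bdevperf pushes traffic through the volume, bdev_raid fails the member in slot 0, and because the level is raid1 and the injected I/O type is write, the test expects the array to stay online with one fewer operational member. A condensed sketch of that assertion, reusing the bdev_raid_get_bdevs | jq pattern from verify_raid_bdev_state and the field names visible in the JSON dump just above:

  $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure
  # ... I/O runs through bdevperf's perform_tests, as in the trace ...
  info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [ "$(echo "$info" | jq -r '.state')" = online ] || exit 1
  [ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq 2 ] || exit 1

In the dump just above, the slot-0 entry has been reduced to a null name with an all-zero uuid and "is_configured": false, while BaseBdev2 and BaseBdev3 remain configured.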
13:37:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:45.862 [2024-07-15 13:37:33.383637] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:45.862 [2024-07-15 13:37:33.383668] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:45.862 [2024-07-15 13:37:33.385697] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:45.862 [2024-07-15 13:37:33.385720] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:45.862 [2024-07-15 13:37:33.385770] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:45.862 [2024-07-15 13:37:33.385779] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1126b40 name raid_bdev1, state offline 00:14:45.862 0 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 20483 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 20483 ']' 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 20483 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 20483 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 20483' 00:14:45.862 killing process with pid 20483 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 20483 00:14:45.862 [2024-07-15 13:37:33.451984] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:45.862 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 20483 00:14:45.862 [2024-07-15 13:37:33.474659] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cj686GcKnX 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:46.120 00:14:46.120 real 0m5.660s 00:14:46.120 user 0m8.667s 00:14:46.120 sys 0m0.981s 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:14:46.120 13:37:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.120 ************************************ 00:14:46.120 END TEST raid_write_error_test 00:14:46.120 ************************************ 00:14:46.120 13:37:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:46.120 13:37:33 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:14:46.120 13:37:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:46.120 13:37:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:14:46.120 13:37:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:46.120 13:37:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:46.120 13:37:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:46.379 ************************************ 00:14:46.379 START TEST raid_state_function_test 00:14:46.379 ************************************ 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=21292 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 21292' 00:14:46.379 Process raid pid: 21292 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 21292 /var/tmp/spdk-raid.sock 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 21292 ']' 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:46.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:46.379 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.379 [2024-07-15 13:37:33.810246] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
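The state-function test drives a bare SPDK application instead of bdevperf: bdev_svc is started with the raid RPC socket and bdev_raid debug logging, and waitforlisten blocks until that socket answers before any RPC is sent. A rough stand-in for that startup, with paths taken from the trace (waitforlisten itself lives in autotest_common.sh; the polling loop and the rpc_get_methods liveness probe below are only a simplified equivalent):

  app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
  $app -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!

  # crude waitforlisten: poll the RPC socket until it responds
  until $rpc rpc_get_methods >/dev/null 2>&1; do
      kill -0 "$raid_pid" || exit 1   # give up if the app died
      sleep 0.1
  done

Because the level here is raid0 rather than raid1, the test also fixes a 64 KiB strip size up front (strip_size_create_arg='-z 64'), and every bdev_raid_create in this test carries it.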
00:14:46.379 [2024-07-15 13:37:33.810289] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:46.379 [2024-07-15 13:37:33.899326] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:46.379 [2024-07-15 13:37:33.990832] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:46.637 [2024-07-15 13:37:34.055291] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:46.637 [2024-07-15 13:37:34.055316] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:47.205 [2024-07-15 13:37:34.767066] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:47.205 [2024-07-15 13:37:34.767099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:47.205 [2024-07-15 13:37:34.767106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:47.205 [2024-07-15 13:37:34.767133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:47.205 [2024-07-15 13:37:34.767138] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:47.205 [2024-07-15 13:37:34.767146] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:47.205 [2024-07-15 13:37:34.767151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:47.205 [2024-07-15 13:37:34.767158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.205 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.205 
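This first bdev_raid_create is issued before any of the four base bdevs exist, so every member is reported as "doesn't exist now" and the raid sits in the configuring state with zero discovered members; the verify_raid_bdev_state call that follows asserts exactly that. A compact version of the assertion, using the counters from the JSON dump that follows:

  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  [ "$(echo "$info" | jq -r '.state')" = configuring ] || exit 1
  [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 0 ] || exit 1

As each BaseBdevN malloc disk is created later in the test, bdev_raid claims it, num_base_bdevs_discovered ticks up by one, and the state only flips to online once all four members are present.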
13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.463 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.463 "name": "Existed_Raid", 00:14:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.463 "strip_size_kb": 64, 00:14:47.463 "state": "configuring", 00:14:47.463 "raid_level": "raid0", 00:14:47.463 "superblock": false, 00:14:47.463 "num_base_bdevs": 4, 00:14:47.463 "num_base_bdevs_discovered": 0, 00:14:47.463 "num_base_bdevs_operational": 4, 00:14:47.463 "base_bdevs_list": [ 00:14:47.463 { 00:14:47.463 "name": "BaseBdev1", 00:14:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.463 "is_configured": false, 00:14:47.463 "data_offset": 0, 00:14:47.463 "data_size": 0 00:14:47.463 }, 00:14:47.463 { 00:14:47.463 "name": "BaseBdev2", 00:14:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.463 "is_configured": false, 00:14:47.463 "data_offset": 0, 00:14:47.463 "data_size": 0 00:14:47.463 }, 00:14:47.463 { 00:14:47.463 "name": "BaseBdev3", 00:14:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.463 "is_configured": false, 00:14:47.463 "data_offset": 0, 00:14:47.463 "data_size": 0 00:14:47.463 }, 00:14:47.463 { 00:14:47.463 "name": "BaseBdev4", 00:14:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.464 "is_configured": false, 00:14:47.464 "data_offset": 0, 00:14:47.464 "data_size": 0 00:14:47.464 } 00:14:47.464 ] 00:14:47.464 }' 00:14:47.464 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.464 13:37:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.029 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:48.030 [2024-07-15 13:37:35.617167] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:48.030 [2024-07-15 13:37:35.617193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1beff70 name Existed_Raid, state configuring 00:14:48.030 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:48.287 [2024-07-15 13:37:35.805666] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:48.287 [2024-07-15 13:37:35.805694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:48.287 [2024-07-15 13:37:35.805703] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:48.287 [2024-07-15 13:37:35.805726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:48.287 [2024-07-15 13:37:35.805732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:48.287 [2024-07-15 13:37:35.805740] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:48.287 [2024-07-15 13:37:35.805745] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:48.287 [2024-07-15 13:37:35.805753] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:48.287 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:48.544 [2024-07-15 13:37:35.990701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:48.544 BaseBdev1 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.544 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.801 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:48.801 [ 00:14:48.801 { 00:14:48.801 "name": "BaseBdev1", 00:14:48.801 "aliases": [ 00:14:48.801 "072d55d4-601b-498b-9fbc-9bedc526530c" 00:14:48.801 ], 00:14:48.801 "product_name": "Malloc disk", 00:14:48.801 "block_size": 512, 00:14:48.801 "num_blocks": 65536, 00:14:48.801 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:48.801 "assigned_rate_limits": { 00:14:48.801 "rw_ios_per_sec": 0, 00:14:48.801 "rw_mbytes_per_sec": 0, 00:14:48.801 "r_mbytes_per_sec": 0, 00:14:48.801 "w_mbytes_per_sec": 0 00:14:48.801 }, 00:14:48.801 "claimed": true, 00:14:48.801 "claim_type": "exclusive_write", 00:14:48.801 "zoned": false, 00:14:48.801 "supported_io_types": { 00:14:48.801 "read": true, 00:14:48.801 "write": true, 00:14:48.801 "unmap": true, 00:14:48.801 "flush": true, 00:14:48.801 "reset": true, 00:14:48.801 "nvme_admin": false, 00:14:48.801 "nvme_io": false, 00:14:48.801 "nvme_io_md": false, 00:14:48.801 "write_zeroes": true, 00:14:48.801 "zcopy": true, 00:14:48.801 "get_zone_info": false, 00:14:48.801 "zone_management": false, 00:14:48.801 "zone_append": false, 00:14:48.801 "compare": false, 00:14:48.801 "compare_and_write": false, 00:14:48.801 "abort": true, 00:14:48.801 "seek_hole": false, 00:14:48.801 "seek_data": false, 00:14:48.801 "copy": true, 00:14:48.801 "nvme_iov_md": false 00:14:48.801 }, 00:14:48.801 "memory_domains": [ 00:14:48.801 { 00:14:48.801 "dma_device_id": "system", 00:14:48.801 "dma_device_type": 1 00:14:48.801 }, 00:14:48.801 { 00:14:48.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.801 "dma_device_type": 2 00:14:48.801 } 00:14:48.801 ], 00:14:48.801 "driver_specific": {} 00:14:48.801 } 00:14:48.801 ] 00:14:48.801 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:48.801 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.802 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.059 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.059 "name": "Existed_Raid", 00:14:49.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.059 "strip_size_kb": 64, 00:14:49.059 "state": "configuring", 00:14:49.059 "raid_level": "raid0", 00:14:49.059 "superblock": false, 00:14:49.059 "num_base_bdevs": 4, 00:14:49.059 "num_base_bdevs_discovered": 1, 00:14:49.059 "num_base_bdevs_operational": 4, 00:14:49.059 "base_bdevs_list": [ 00:14:49.059 { 00:14:49.059 "name": "BaseBdev1", 00:14:49.059 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:49.059 "is_configured": true, 00:14:49.059 "data_offset": 0, 00:14:49.059 "data_size": 65536 00:14:49.059 }, 00:14:49.059 { 00:14:49.059 "name": "BaseBdev2", 00:14:49.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.060 "is_configured": false, 00:14:49.060 "data_offset": 0, 00:14:49.060 "data_size": 0 00:14:49.060 }, 00:14:49.060 { 00:14:49.060 "name": "BaseBdev3", 00:14:49.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.060 "is_configured": false, 00:14:49.060 "data_offset": 0, 00:14:49.060 "data_size": 0 00:14:49.060 }, 00:14:49.060 { 00:14:49.060 "name": "BaseBdev4", 00:14:49.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.060 "is_configured": false, 00:14:49.060 "data_offset": 0, 00:14:49.060 "data_size": 0 00:14:49.060 } 00:14:49.060 ] 00:14:49.060 }' 00:14:49.060 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.060 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.625 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:49.625 [2024-07-15 13:37:37.157717] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:49.625 [2024-07-15 13:37:37.157753] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bef7e0 name Existed_Raid, state configuring 00:14:49.625 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 
64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:49.881 [2024-07-15 13:37:37.338207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:49.881 [2024-07-15 13:37:37.339240] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:49.881 [2024-07-15 13:37:37.339267] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:49.881 [2024-07-15 13:37:37.339274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:49.881 [2024-07-15 13:37:37.339281] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:49.881 [2024-07-15 13:37:37.339287] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:49.881 [2024-07-15 13:37:37.339294] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.881 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.138 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.138 "name": "Existed_Raid", 00:14:50.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.138 "strip_size_kb": 64, 00:14:50.138 "state": "configuring", 00:14:50.138 "raid_level": "raid0", 00:14:50.138 "superblock": false, 00:14:50.138 "num_base_bdevs": 4, 00:14:50.138 "num_base_bdevs_discovered": 1, 00:14:50.138 "num_base_bdevs_operational": 4, 00:14:50.138 "base_bdevs_list": [ 00:14:50.138 { 00:14:50.138 "name": "BaseBdev1", 00:14:50.138 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:50.138 "is_configured": true, 00:14:50.138 "data_offset": 0, 00:14:50.138 "data_size": 65536 00:14:50.138 }, 00:14:50.138 { 00:14:50.138 "name": "BaseBdev2", 00:14:50.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.138 "is_configured": 
false, 00:14:50.138 "data_offset": 0, 00:14:50.138 "data_size": 0 00:14:50.138 }, 00:14:50.138 { 00:14:50.138 "name": "BaseBdev3", 00:14:50.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.138 "is_configured": false, 00:14:50.138 "data_offset": 0, 00:14:50.138 "data_size": 0 00:14:50.138 }, 00:14:50.138 { 00:14:50.138 "name": "BaseBdev4", 00:14:50.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.138 "is_configured": false, 00:14:50.138 "data_offset": 0, 00:14:50.138 "data_size": 0 00:14:50.138 } 00:14:50.138 ] 00:14:50.138 }' 00:14:50.138 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.138 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:50.703 [2024-07-15 13:37:38.195154] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:50.703 BaseBdev2 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:50.703 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:50.961 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:51.219 [ 00:14:51.220 { 00:14:51.220 "name": "BaseBdev2", 00:14:51.220 "aliases": [ 00:14:51.220 "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25" 00:14:51.220 ], 00:14:51.220 "product_name": "Malloc disk", 00:14:51.220 "block_size": 512, 00:14:51.220 "num_blocks": 65536, 00:14:51.220 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:51.220 "assigned_rate_limits": { 00:14:51.220 "rw_ios_per_sec": 0, 00:14:51.220 "rw_mbytes_per_sec": 0, 00:14:51.220 "r_mbytes_per_sec": 0, 00:14:51.220 "w_mbytes_per_sec": 0 00:14:51.220 }, 00:14:51.220 "claimed": true, 00:14:51.220 "claim_type": "exclusive_write", 00:14:51.220 "zoned": false, 00:14:51.220 "supported_io_types": { 00:14:51.220 "read": true, 00:14:51.220 "write": true, 00:14:51.220 "unmap": true, 00:14:51.220 "flush": true, 00:14:51.220 "reset": true, 00:14:51.220 "nvme_admin": false, 00:14:51.220 "nvme_io": false, 00:14:51.220 "nvme_io_md": false, 00:14:51.220 "write_zeroes": true, 00:14:51.220 "zcopy": true, 00:14:51.220 "get_zone_info": false, 00:14:51.220 "zone_management": false, 00:14:51.220 "zone_append": false, 00:14:51.220 "compare": false, 00:14:51.220 "compare_and_write": false, 00:14:51.220 "abort": true, 00:14:51.220 "seek_hole": false, 00:14:51.220 "seek_data": false, 00:14:51.220 "copy": true, 00:14:51.220 "nvme_iov_md": false 00:14:51.220 }, 00:14:51.220 
"memory_domains": [ 00:14:51.220 { 00:14:51.220 "dma_device_id": "system", 00:14:51.220 "dma_device_type": 1 00:14:51.220 }, 00:14:51.220 { 00:14:51.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.220 "dma_device_type": 2 00:14:51.220 } 00:14:51.220 ], 00:14:51.220 "driver_specific": {} 00:14:51.220 } 00:14:51.220 ] 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.220 "name": "Existed_Raid", 00:14:51.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.220 "strip_size_kb": 64, 00:14:51.220 "state": "configuring", 00:14:51.220 "raid_level": "raid0", 00:14:51.220 "superblock": false, 00:14:51.220 "num_base_bdevs": 4, 00:14:51.220 "num_base_bdevs_discovered": 2, 00:14:51.220 "num_base_bdevs_operational": 4, 00:14:51.220 "base_bdevs_list": [ 00:14:51.220 { 00:14:51.220 "name": "BaseBdev1", 00:14:51.220 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:51.220 "is_configured": true, 00:14:51.220 "data_offset": 0, 00:14:51.220 "data_size": 65536 00:14:51.220 }, 00:14:51.220 { 00:14:51.220 "name": "BaseBdev2", 00:14:51.220 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:51.220 "is_configured": true, 00:14:51.220 "data_offset": 0, 00:14:51.220 "data_size": 65536 00:14:51.220 }, 00:14:51.220 { 00:14:51.220 "name": "BaseBdev3", 00:14:51.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.220 "is_configured": false, 00:14:51.220 "data_offset": 0, 00:14:51.220 "data_size": 0 00:14:51.220 }, 00:14:51.220 { 00:14:51.220 "name": "BaseBdev4", 00:14:51.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.220 "is_configured": false, 00:14:51.220 "data_offset": 0, 00:14:51.220 "data_size": 0 00:14:51.220 } 00:14:51.220 ] 00:14:51.220 
}' 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.220 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.786 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:52.045 [2024-07-15 13:37:39.450399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:52.045 BaseBdev3 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:52.045 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:52.304 [ 00:14:52.304 { 00:14:52.304 "name": "BaseBdev3", 00:14:52.304 "aliases": [ 00:14:52.304 "d904e415-c5f3-49f3-afee-b24692fbc047" 00:14:52.304 ], 00:14:52.304 "product_name": "Malloc disk", 00:14:52.304 "block_size": 512, 00:14:52.304 "num_blocks": 65536, 00:14:52.304 "uuid": "d904e415-c5f3-49f3-afee-b24692fbc047", 00:14:52.304 "assigned_rate_limits": { 00:14:52.304 "rw_ios_per_sec": 0, 00:14:52.304 "rw_mbytes_per_sec": 0, 00:14:52.304 "r_mbytes_per_sec": 0, 00:14:52.304 "w_mbytes_per_sec": 0 00:14:52.304 }, 00:14:52.304 "claimed": true, 00:14:52.304 "claim_type": "exclusive_write", 00:14:52.304 "zoned": false, 00:14:52.304 "supported_io_types": { 00:14:52.304 "read": true, 00:14:52.304 "write": true, 00:14:52.304 "unmap": true, 00:14:52.304 "flush": true, 00:14:52.304 "reset": true, 00:14:52.304 "nvme_admin": false, 00:14:52.304 "nvme_io": false, 00:14:52.304 "nvme_io_md": false, 00:14:52.304 "write_zeroes": true, 00:14:52.304 "zcopy": true, 00:14:52.304 "get_zone_info": false, 00:14:52.304 "zone_management": false, 00:14:52.304 "zone_append": false, 00:14:52.304 "compare": false, 00:14:52.304 "compare_and_write": false, 00:14:52.304 "abort": true, 00:14:52.304 "seek_hole": false, 00:14:52.304 "seek_data": false, 00:14:52.304 "copy": true, 00:14:52.304 "nvme_iov_md": false 00:14:52.304 }, 00:14:52.304 "memory_domains": [ 00:14:52.304 { 00:14:52.304 "dma_device_id": "system", 00:14:52.304 "dma_device_type": 1 00:14:52.304 }, 00:14:52.304 { 00:14:52.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.304 "dma_device_type": 2 00:14:52.304 } 00:14:52.304 ], 00:14:52.304 "driver_specific": {} 00:14:52.304 } 00:14:52.304 ] 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:52.304 13:37:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.304 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.562 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.562 "name": "Existed_Raid", 00:14:52.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.562 "strip_size_kb": 64, 00:14:52.562 "state": "configuring", 00:14:52.562 "raid_level": "raid0", 00:14:52.562 "superblock": false, 00:14:52.562 "num_base_bdevs": 4, 00:14:52.562 "num_base_bdevs_discovered": 3, 00:14:52.562 "num_base_bdevs_operational": 4, 00:14:52.563 "base_bdevs_list": [ 00:14:52.563 { 00:14:52.563 "name": "BaseBdev1", 00:14:52.563 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:52.563 "is_configured": true, 00:14:52.563 "data_offset": 0, 00:14:52.563 "data_size": 65536 00:14:52.563 }, 00:14:52.563 { 00:14:52.563 "name": "BaseBdev2", 00:14:52.563 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:52.563 "is_configured": true, 00:14:52.563 "data_offset": 0, 00:14:52.563 "data_size": 65536 00:14:52.563 }, 00:14:52.563 { 00:14:52.563 "name": "BaseBdev3", 00:14:52.563 "uuid": "d904e415-c5f3-49f3-afee-b24692fbc047", 00:14:52.563 "is_configured": true, 00:14:52.563 "data_offset": 0, 00:14:52.563 "data_size": 65536 00:14:52.563 }, 00:14:52.563 { 00:14:52.563 "name": "BaseBdev4", 00:14:52.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.563 "is_configured": false, 00:14:52.563 "data_offset": 0, 00:14:52.563 "data_size": 0 00:14:52.563 } 00:14:52.563 ] 00:14:52.563 }' 00:14:52.563 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.563 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:53.129 [2024-07-15 13:37:40.660372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is 
claimed 00:14:53.129 [2024-07-15 13:37:40.660403] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bf0840 00:14:53.129 [2024-07-15 13:37:40.660409] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:53.129 [2024-07-15 13:37:40.660562] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf0480 00:14:53.129 [2024-07-15 13:37:40.660649] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bf0840 00:14:53.129 [2024-07-15 13:37:40.660656] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bf0840 00:14:53.129 [2024-07-15 13:37:40.660778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:53.129 BaseBdev4 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:53.129 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:53.387 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:53.387 [ 00:14:53.387 { 00:14:53.387 "name": "BaseBdev4", 00:14:53.387 "aliases": [ 00:14:53.387 "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d" 00:14:53.387 ], 00:14:53.387 "product_name": "Malloc disk", 00:14:53.387 "block_size": 512, 00:14:53.387 "num_blocks": 65536, 00:14:53.387 "uuid": "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d", 00:14:53.387 "assigned_rate_limits": { 00:14:53.387 "rw_ios_per_sec": 0, 00:14:53.387 "rw_mbytes_per_sec": 0, 00:14:53.387 "r_mbytes_per_sec": 0, 00:14:53.387 "w_mbytes_per_sec": 0 00:14:53.387 }, 00:14:53.387 "claimed": true, 00:14:53.387 "claim_type": "exclusive_write", 00:14:53.387 "zoned": false, 00:14:53.387 "supported_io_types": { 00:14:53.387 "read": true, 00:14:53.387 "write": true, 00:14:53.387 "unmap": true, 00:14:53.387 "flush": true, 00:14:53.387 "reset": true, 00:14:53.387 "nvme_admin": false, 00:14:53.387 "nvme_io": false, 00:14:53.387 "nvme_io_md": false, 00:14:53.387 "write_zeroes": true, 00:14:53.387 "zcopy": true, 00:14:53.387 "get_zone_info": false, 00:14:53.387 "zone_management": false, 00:14:53.387 "zone_append": false, 00:14:53.387 "compare": false, 00:14:53.387 "compare_and_write": false, 00:14:53.388 "abort": true, 00:14:53.388 "seek_hole": false, 00:14:53.388 "seek_data": false, 00:14:53.388 "copy": true, 00:14:53.388 "nvme_iov_md": false 00:14:53.388 }, 00:14:53.388 "memory_domains": [ 00:14:53.388 { 00:14:53.388 "dma_device_id": "system", 00:14:53.388 "dma_device_type": 1 00:14:53.388 }, 00:14:53.388 { 00:14:53.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.388 "dma_device_type": 2 00:14:53.388 } 00:14:53.388 ], 00:14:53.388 "driver_specific": {} 00:14:53.388 } 00:14:53.388 ] 
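Creating the fourth malloc disk completes the member set, so bdev_raid registers the raid0 I/O device (blockcnt 262144, blocklen 512) and brings Existed_Raid online. The waitforbdev call that follows is the synchronization point: it waits for examine to finish and then looks the new bdev up with a timeout, so the test cannot race ahead of registration. A reduced form of that pattern, with the 2000 (the helper's bdev_timeout default, presumably milliseconds) taken from the trace:

  $rpc bdev_malloc_create 32 512 -b BaseBdev4
  # block until all bdev examine callbacks have completed
  $rpc bdev_wait_for_examine
  # -t 2000: keep retrying the lookup until the bdev appears or the timeout expires
  $rpc bdev_get_bdevs -b BaseBdev4 -t 2000 >/dev/null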
00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.646 "name": "Existed_Raid", 00:14:53.646 "uuid": "08b1b960-b8f1-4b5a-bb95-99288ee2fbee", 00:14:53.646 "strip_size_kb": 64, 00:14:53.646 "state": "online", 00:14:53.646 "raid_level": "raid0", 00:14:53.646 "superblock": false, 00:14:53.646 "num_base_bdevs": 4, 00:14:53.646 "num_base_bdevs_discovered": 4, 00:14:53.646 "num_base_bdevs_operational": 4, 00:14:53.646 "base_bdevs_list": [ 00:14:53.646 { 00:14:53.646 "name": "BaseBdev1", 00:14:53.646 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:53.646 "is_configured": true, 00:14:53.646 "data_offset": 0, 00:14:53.646 "data_size": 65536 00:14:53.646 }, 00:14:53.646 { 00:14:53.646 "name": "BaseBdev2", 00:14:53.646 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:53.646 "is_configured": true, 00:14:53.646 "data_offset": 0, 00:14:53.646 "data_size": 65536 00:14:53.646 }, 00:14:53.646 { 00:14:53.646 "name": "BaseBdev3", 00:14:53.646 "uuid": "d904e415-c5f3-49f3-afee-b24692fbc047", 00:14:53.646 "is_configured": true, 00:14:53.646 "data_offset": 0, 00:14:53.646 "data_size": 65536 00:14:53.646 }, 00:14:53.646 { 00:14:53.646 "name": "BaseBdev4", 00:14:53.646 "uuid": "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d", 00:14:53.646 "is_configured": true, 00:14:53.646 "data_offset": 0, 00:14:53.646 "data_size": 65536 00:14:53.646 } 00:14:53.646 ] 00:14:53.646 }' 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.646 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
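With all four members claimed, the state check above reports online with 4 of 4 base bdevs discovered and operational, and verify_raid_bdev_properties then inspects the raid volume itself. The geometry is easy to sanity-check by hand: four 65536-block members in raid0 with a 64 KiB strip (member sizes being an exact multiple of the strip, so nothing is truncated) give 4 x 65536 = 262144 blocks of 512 bytes, which is what the bdev_get_bdevs dump below reports. A tiny check along those lines:

  vol=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[0]')
  [ "$(echo "$vol" | jq -r '.num_blocks')" -eq $((4 * 65536)) ] || exit 1   # 262144
  [ "$(echo "$vol" | jq -r '.block_size')" -eq 512 ] || exit 1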
00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:54.211 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:54.470 [2024-07-15 13:37:41.867703] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:54.470 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:54.470 "name": "Existed_Raid", 00:14:54.470 "aliases": [ 00:14:54.470 "08b1b960-b8f1-4b5a-bb95-99288ee2fbee" 00:14:54.470 ], 00:14:54.470 "product_name": "Raid Volume", 00:14:54.470 "block_size": 512, 00:14:54.470 "num_blocks": 262144, 00:14:54.470 "uuid": "08b1b960-b8f1-4b5a-bb95-99288ee2fbee", 00:14:54.470 "assigned_rate_limits": { 00:14:54.470 "rw_ios_per_sec": 0, 00:14:54.470 "rw_mbytes_per_sec": 0, 00:14:54.470 "r_mbytes_per_sec": 0, 00:14:54.470 "w_mbytes_per_sec": 0 00:14:54.470 }, 00:14:54.470 "claimed": false, 00:14:54.470 "zoned": false, 00:14:54.470 "supported_io_types": { 00:14:54.470 "read": true, 00:14:54.470 "write": true, 00:14:54.470 "unmap": true, 00:14:54.470 "flush": true, 00:14:54.470 "reset": true, 00:14:54.470 "nvme_admin": false, 00:14:54.470 "nvme_io": false, 00:14:54.470 "nvme_io_md": false, 00:14:54.470 "write_zeroes": true, 00:14:54.470 "zcopy": false, 00:14:54.470 "get_zone_info": false, 00:14:54.470 "zone_management": false, 00:14:54.470 "zone_append": false, 00:14:54.470 "compare": false, 00:14:54.470 "compare_and_write": false, 00:14:54.470 "abort": false, 00:14:54.470 "seek_hole": false, 00:14:54.470 "seek_data": false, 00:14:54.470 "copy": false, 00:14:54.470 "nvme_iov_md": false 00:14:54.470 }, 00:14:54.470 "memory_domains": [ 00:14:54.470 { 00:14:54.470 "dma_device_id": "system", 00:14:54.470 "dma_device_type": 1 00:14:54.470 }, 00:14:54.470 { 00:14:54.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.470 "dma_device_type": 2 00:14:54.470 }, 00:14:54.470 { 00:14:54.470 "dma_device_id": "system", 00:14:54.470 "dma_device_type": 1 00:14:54.470 }, 00:14:54.470 { 00:14:54.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.470 "dma_device_type": 2 00:14:54.470 }, 00:14:54.470 { 00:14:54.470 "dma_device_id": "system", 00:14:54.470 "dma_device_type": 1 00:14:54.471 }, 00:14:54.471 { 00:14:54.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.471 "dma_device_type": 2 00:14:54.471 }, 00:14:54.471 { 00:14:54.471 "dma_device_id": "system", 00:14:54.471 "dma_device_type": 1 00:14:54.471 }, 00:14:54.471 { 00:14:54.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.471 "dma_device_type": 2 00:14:54.471 } 00:14:54.471 ], 00:14:54.471 "driver_specific": { 00:14:54.471 "raid": { 00:14:54.471 "uuid": "08b1b960-b8f1-4b5a-bb95-99288ee2fbee", 00:14:54.471 "strip_size_kb": 64, 00:14:54.471 "state": "online", 00:14:54.471 "raid_level": "raid0", 00:14:54.471 "superblock": false, 00:14:54.471 
"num_base_bdevs": 4, 00:14:54.471 "num_base_bdevs_discovered": 4, 00:14:54.471 "num_base_bdevs_operational": 4, 00:14:54.471 "base_bdevs_list": [ 00:14:54.471 { 00:14:54.471 "name": "BaseBdev1", 00:14:54.471 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:54.471 "is_configured": true, 00:14:54.471 "data_offset": 0, 00:14:54.471 "data_size": 65536 00:14:54.471 }, 00:14:54.471 { 00:14:54.471 "name": "BaseBdev2", 00:14:54.471 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:54.471 "is_configured": true, 00:14:54.471 "data_offset": 0, 00:14:54.471 "data_size": 65536 00:14:54.471 }, 00:14:54.471 { 00:14:54.471 "name": "BaseBdev3", 00:14:54.471 "uuid": "d904e415-c5f3-49f3-afee-b24692fbc047", 00:14:54.471 "is_configured": true, 00:14:54.471 "data_offset": 0, 00:14:54.471 "data_size": 65536 00:14:54.471 }, 00:14:54.471 { 00:14:54.471 "name": "BaseBdev4", 00:14:54.471 "uuid": "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d", 00:14:54.471 "is_configured": true, 00:14:54.471 "data_offset": 0, 00:14:54.471 "data_size": 65536 00:14:54.471 } 00:14:54.471 ] 00:14:54.471 } 00:14:54.471 } 00:14:54.471 }' 00:14:54.471 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:54.471 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:54.471 BaseBdev2 00:14:54.471 BaseBdev3 00:14:54.471 BaseBdev4' 00:14:54.471 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:54.471 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:54.471 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:54.730 "name": "BaseBdev1", 00:14:54.730 "aliases": [ 00:14:54.730 "072d55d4-601b-498b-9fbc-9bedc526530c" 00:14:54.730 ], 00:14:54.730 "product_name": "Malloc disk", 00:14:54.730 "block_size": 512, 00:14:54.730 "num_blocks": 65536, 00:14:54.730 "uuid": "072d55d4-601b-498b-9fbc-9bedc526530c", 00:14:54.730 "assigned_rate_limits": { 00:14:54.730 "rw_ios_per_sec": 0, 00:14:54.730 "rw_mbytes_per_sec": 0, 00:14:54.730 "r_mbytes_per_sec": 0, 00:14:54.730 "w_mbytes_per_sec": 0 00:14:54.730 }, 00:14:54.730 "claimed": true, 00:14:54.730 "claim_type": "exclusive_write", 00:14:54.730 "zoned": false, 00:14:54.730 "supported_io_types": { 00:14:54.730 "read": true, 00:14:54.730 "write": true, 00:14:54.730 "unmap": true, 00:14:54.730 "flush": true, 00:14:54.730 "reset": true, 00:14:54.730 "nvme_admin": false, 00:14:54.730 "nvme_io": false, 00:14:54.730 "nvme_io_md": false, 00:14:54.730 "write_zeroes": true, 00:14:54.730 "zcopy": true, 00:14:54.730 "get_zone_info": false, 00:14:54.730 "zone_management": false, 00:14:54.730 "zone_append": false, 00:14:54.730 "compare": false, 00:14:54.730 "compare_and_write": false, 00:14:54.730 "abort": true, 00:14:54.730 "seek_hole": false, 00:14:54.730 "seek_data": false, 00:14:54.730 "copy": true, 00:14:54.730 "nvme_iov_md": false 00:14:54.730 }, 00:14:54.730 "memory_domains": [ 00:14:54.730 { 00:14:54.730 "dma_device_id": "system", 00:14:54.730 "dma_device_type": 1 00:14:54.730 }, 00:14:54.730 { 00:14:54.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.730 "dma_device_type": 2 00:14:54.730 } 00:14:54.730 ], 
00:14:54.730 "driver_specific": {} 00:14:54.730 }' 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:54.730 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:54.990 "name": "BaseBdev2", 00:14:54.990 "aliases": [ 00:14:54.990 "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25" 00:14:54.990 ], 00:14:54.990 "product_name": "Malloc disk", 00:14:54.990 "block_size": 512, 00:14:54.990 "num_blocks": 65536, 00:14:54.990 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:54.990 "assigned_rate_limits": { 00:14:54.990 "rw_ios_per_sec": 0, 00:14:54.990 "rw_mbytes_per_sec": 0, 00:14:54.990 "r_mbytes_per_sec": 0, 00:14:54.990 "w_mbytes_per_sec": 0 00:14:54.990 }, 00:14:54.990 "claimed": true, 00:14:54.990 "claim_type": "exclusive_write", 00:14:54.990 "zoned": false, 00:14:54.990 "supported_io_types": { 00:14:54.990 "read": true, 00:14:54.990 "write": true, 00:14:54.990 "unmap": true, 00:14:54.990 "flush": true, 00:14:54.990 "reset": true, 00:14:54.990 "nvme_admin": false, 00:14:54.990 "nvme_io": false, 00:14:54.990 "nvme_io_md": false, 00:14:54.990 "write_zeroes": true, 00:14:54.990 "zcopy": true, 00:14:54.990 "get_zone_info": false, 00:14:54.990 "zone_management": false, 00:14:54.990 "zone_append": false, 00:14:54.990 "compare": false, 00:14:54.990 "compare_and_write": false, 00:14:54.990 "abort": true, 00:14:54.990 "seek_hole": false, 00:14:54.990 "seek_data": false, 00:14:54.990 "copy": true, 00:14:54.990 "nvme_iov_md": false 00:14:54.990 }, 00:14:54.990 "memory_domains": [ 00:14:54.990 { 00:14:54.990 "dma_device_id": "system", 00:14:54.990 "dma_device_type": 1 00:14:54.990 }, 00:14:54.990 { 00:14:54.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.990 "dma_device_type": 2 00:14:54.990 } 00:14:54.990 ], 00:14:54.990 "driver_specific": {} 00:14:54.990 }' 00:14:54.990 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.249 13:37:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:55.249 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.508 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.508 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:55.508 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:55.508 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:55.508 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:55.508 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:55.508 "name": "BaseBdev3", 00:14:55.508 "aliases": [ 00:14:55.508 "d904e415-c5f3-49f3-afee-b24692fbc047" 00:14:55.508 ], 00:14:55.508 "product_name": "Malloc disk", 00:14:55.508 "block_size": 512, 00:14:55.508 "num_blocks": 65536, 00:14:55.508 "uuid": "d904e415-c5f3-49f3-afee-b24692fbc047", 00:14:55.508 "assigned_rate_limits": { 00:14:55.508 "rw_ios_per_sec": 0, 00:14:55.508 "rw_mbytes_per_sec": 0, 00:14:55.508 "r_mbytes_per_sec": 0, 00:14:55.508 "w_mbytes_per_sec": 0 00:14:55.508 }, 00:14:55.508 "claimed": true, 00:14:55.508 "claim_type": "exclusive_write", 00:14:55.508 "zoned": false, 00:14:55.508 "supported_io_types": { 00:14:55.508 "read": true, 00:14:55.508 "write": true, 00:14:55.508 "unmap": true, 00:14:55.508 "flush": true, 00:14:55.508 "reset": true, 00:14:55.508 "nvme_admin": false, 00:14:55.508 "nvme_io": false, 00:14:55.508 "nvme_io_md": false, 00:14:55.508 "write_zeroes": true, 00:14:55.508 "zcopy": true, 00:14:55.508 "get_zone_info": false, 00:14:55.508 "zone_management": false, 00:14:55.508 "zone_append": false, 00:14:55.508 "compare": false, 00:14:55.508 "compare_and_write": false, 00:14:55.508 "abort": true, 00:14:55.508 "seek_hole": false, 00:14:55.508 "seek_data": false, 00:14:55.508 "copy": true, 00:14:55.508 "nvme_iov_md": false 00:14:55.508 }, 00:14:55.508 "memory_domains": [ 00:14:55.508 { 00:14:55.508 "dma_device_id": "system", 00:14:55.508 "dma_device_type": 1 00:14:55.508 }, 00:14:55.508 { 00:14:55.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.508 "dma_device_type": 2 00:14:55.508 } 00:14:55.508 ], 00:14:55.508 "driver_specific": {} 00:14:55.508 }' 00:14:55.508 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 
]] 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.769 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.029 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.029 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.029 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:56.029 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.029 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.029 "name": "BaseBdev4", 00:14:56.029 "aliases": [ 00:14:56.029 "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d" 00:14:56.029 ], 00:14:56.029 "product_name": "Malloc disk", 00:14:56.029 "block_size": 512, 00:14:56.029 "num_blocks": 65536, 00:14:56.029 "uuid": "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d", 00:14:56.029 "assigned_rate_limits": { 00:14:56.029 "rw_ios_per_sec": 0, 00:14:56.029 "rw_mbytes_per_sec": 0, 00:14:56.029 "r_mbytes_per_sec": 0, 00:14:56.030 "w_mbytes_per_sec": 0 00:14:56.030 }, 00:14:56.030 "claimed": true, 00:14:56.030 "claim_type": "exclusive_write", 00:14:56.030 "zoned": false, 00:14:56.030 "supported_io_types": { 00:14:56.030 "read": true, 00:14:56.030 "write": true, 00:14:56.030 "unmap": true, 00:14:56.030 "flush": true, 00:14:56.030 "reset": true, 00:14:56.030 "nvme_admin": false, 00:14:56.030 "nvme_io": false, 00:14:56.030 "nvme_io_md": false, 00:14:56.030 "write_zeroes": true, 00:14:56.030 "zcopy": true, 00:14:56.030 "get_zone_info": false, 00:14:56.030 "zone_management": false, 00:14:56.030 "zone_append": false, 00:14:56.030 "compare": false, 00:14:56.030 "compare_and_write": false, 00:14:56.030 "abort": true, 00:14:56.030 "seek_hole": false, 00:14:56.030 "seek_data": false, 00:14:56.030 "copy": true, 00:14:56.030 "nvme_iov_md": false 00:14:56.030 }, 00:14:56.030 "memory_domains": [ 00:14:56.030 { 00:14:56.030 "dma_device_id": "system", 00:14:56.030 "dma_device_type": 1 00:14:56.030 }, 00:14:56.030 { 00:14:56.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.030 "dma_device_type": 2 00:14:56.030 } 00:14:56.030 ], 00:14:56.030 "driver_specific": {} 00:14:56.030 }' 00:14:56.030 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.030 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.289 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:56.548 [2024-07-15 13:37:44.029105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:56.548 [2024-07-15 13:37:44.029128] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:56.548 [2024-07-15 13:37:44.029162] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.548 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.807 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.807 "name": "Existed_Raid", 00:14:56.807 "uuid": "08b1b960-b8f1-4b5a-bb95-99288ee2fbee", 00:14:56.807 
"strip_size_kb": 64, 00:14:56.807 "state": "offline", 00:14:56.807 "raid_level": "raid0", 00:14:56.807 "superblock": false, 00:14:56.807 "num_base_bdevs": 4, 00:14:56.807 "num_base_bdevs_discovered": 3, 00:14:56.807 "num_base_bdevs_operational": 3, 00:14:56.807 "base_bdevs_list": [ 00:14:56.807 { 00:14:56.807 "name": null, 00:14:56.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.807 "is_configured": false, 00:14:56.807 "data_offset": 0, 00:14:56.807 "data_size": 65536 00:14:56.807 }, 00:14:56.807 { 00:14:56.807 "name": "BaseBdev2", 00:14:56.807 "uuid": "a3e0b415-d552-4d8b-b4d2-dfb9e5b5fb25", 00:14:56.807 "is_configured": true, 00:14:56.807 "data_offset": 0, 00:14:56.807 "data_size": 65536 00:14:56.807 }, 00:14:56.807 { 00:14:56.807 "name": "BaseBdev3", 00:14:56.807 "uuid": "d904e415-c5f3-49f3-afee-b24692fbc047", 00:14:56.807 "is_configured": true, 00:14:56.807 "data_offset": 0, 00:14:56.807 "data_size": 65536 00:14:56.807 }, 00:14:56.807 { 00:14:56.807 "name": "BaseBdev4", 00:14:56.807 "uuid": "16a6f8e8-70ab-40a9-bccb-5f26d02bc73d", 00:14:56.807 "is_configured": true, 00:14:56.807 "data_offset": 0, 00:14:56.807 "data_size": 65536 00:14:56.807 } 00:14:56.807 ] 00:14:56.807 }' 00:14:56.807 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.807 13:37:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:57.375 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:57.635 [2024-07-15 13:37:45.061278] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:57.635 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:57.635 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:57.635 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.635 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:57.893 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:57.894 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:57.894 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:57.894 [2024-07-15 13:37:45.424221] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 
00:14:57.894 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:57.894 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:57.894 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.894 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:58.152 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:58.152 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:58.152 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:58.411 [2024-07-15 13:37:45.771053] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:58.411 [2024-07-15 13:37:45.771086] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bf0840 name Existed_Raid, state offline 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:58.411 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:58.669 BaseBdev2 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:58.669 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:58.928 [ 00:14:58.928 { 00:14:58.928 "name": "BaseBdev2", 00:14:58.928 "aliases": [ 00:14:58.928 "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca" 00:14:58.928 ], 00:14:58.928 "product_name": "Malloc disk", 00:14:58.928 "block_size": 512, 00:14:58.928 "num_blocks": 65536, 00:14:58.928 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:14:58.928 "assigned_rate_limits": { 00:14:58.928 "rw_ios_per_sec": 0, 00:14:58.928 "rw_mbytes_per_sec": 0, 00:14:58.928 "r_mbytes_per_sec": 0, 00:14:58.928 "w_mbytes_per_sec": 0 00:14:58.928 }, 00:14:58.928 "claimed": false, 00:14:58.928 "zoned": false, 00:14:58.928 "supported_io_types": { 00:14:58.928 "read": true, 00:14:58.928 "write": true, 00:14:58.928 "unmap": true, 00:14:58.928 "flush": true, 00:14:58.928 "reset": true, 00:14:58.928 "nvme_admin": false, 00:14:58.928 "nvme_io": false, 00:14:58.928 "nvme_io_md": false, 00:14:58.928 "write_zeroes": true, 00:14:58.928 "zcopy": true, 00:14:58.928 "get_zone_info": false, 00:14:58.928 "zone_management": false, 00:14:58.928 "zone_append": false, 00:14:58.928 "compare": false, 00:14:58.928 "compare_and_write": false, 00:14:58.928 "abort": true, 00:14:58.928 "seek_hole": false, 00:14:58.928 "seek_data": false, 00:14:58.928 "copy": true, 00:14:58.928 "nvme_iov_md": false 00:14:58.928 }, 00:14:58.928 "memory_domains": [ 00:14:58.928 { 00:14:58.928 "dma_device_id": "system", 00:14:58.928 "dma_device_type": 1 00:14:58.928 }, 00:14:58.928 { 00:14:58.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.928 "dma_device_type": 2 00:14:58.928 } 00:14:58.928 ], 00:14:58.928 "driver_specific": {} 00:14:58.928 } 00:14:58.928 ] 00:14:58.928 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:58.928 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:58.928 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:58.928 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:59.188 BaseBdev3 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:59.188 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:59.447 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:59.447 [ 00:14:59.447 { 00:14:59.447 "name": "BaseBdev3", 00:14:59.447 "aliases": [ 00:14:59.447 "70c918a7-3ba0-45ff-8a16-1ad1c574b731" 00:14:59.447 ], 00:14:59.447 "product_name": "Malloc disk", 00:14:59.447 "block_size": 
512, 00:14:59.447 "num_blocks": 65536, 00:14:59.447 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:14:59.447 "assigned_rate_limits": { 00:14:59.447 "rw_ios_per_sec": 0, 00:14:59.447 "rw_mbytes_per_sec": 0, 00:14:59.447 "r_mbytes_per_sec": 0, 00:14:59.447 "w_mbytes_per_sec": 0 00:14:59.447 }, 00:14:59.447 "claimed": false, 00:14:59.448 "zoned": false, 00:14:59.448 "supported_io_types": { 00:14:59.448 "read": true, 00:14:59.448 "write": true, 00:14:59.448 "unmap": true, 00:14:59.448 "flush": true, 00:14:59.448 "reset": true, 00:14:59.448 "nvme_admin": false, 00:14:59.448 "nvme_io": false, 00:14:59.448 "nvme_io_md": false, 00:14:59.448 "write_zeroes": true, 00:14:59.448 "zcopy": true, 00:14:59.448 "get_zone_info": false, 00:14:59.448 "zone_management": false, 00:14:59.448 "zone_append": false, 00:14:59.448 "compare": false, 00:14:59.448 "compare_and_write": false, 00:14:59.448 "abort": true, 00:14:59.448 "seek_hole": false, 00:14:59.448 "seek_data": false, 00:14:59.448 "copy": true, 00:14:59.448 "nvme_iov_md": false 00:14:59.448 }, 00:14:59.448 "memory_domains": [ 00:14:59.448 { 00:14:59.448 "dma_device_id": "system", 00:14:59.448 "dma_device_type": 1 00:14:59.448 }, 00:14:59.448 { 00:14:59.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.448 "dma_device_type": 2 00:14:59.448 } 00:14:59.448 ], 00:14:59.448 "driver_specific": {} 00:14:59.448 } 00:14:59.448 ] 00:14:59.448 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:59.448 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:59.448 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:59.448 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:59.707 BaseBdev4 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:59.707 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:59.966 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:59.966 [ 00:14:59.966 { 00:14:59.966 "name": "BaseBdev4", 00:14:59.966 "aliases": [ 00:14:59.966 "171463aa-31b3-4e64-912e-26bfec6a1fc7" 00:14:59.966 ], 00:14:59.966 "product_name": "Malloc disk", 00:14:59.966 "block_size": 512, 00:14:59.966 "num_blocks": 65536, 00:14:59.966 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:14:59.966 "assigned_rate_limits": { 00:14:59.966 "rw_ios_per_sec": 0, 00:14:59.966 "rw_mbytes_per_sec": 0, 00:14:59.966 "r_mbytes_per_sec": 0, 00:14:59.966 "w_mbytes_per_sec": 0 
00:14:59.966 }, 00:14:59.966 "claimed": false, 00:14:59.966 "zoned": false, 00:14:59.966 "supported_io_types": { 00:14:59.966 "read": true, 00:14:59.966 "write": true, 00:14:59.966 "unmap": true, 00:14:59.966 "flush": true, 00:14:59.966 "reset": true, 00:14:59.966 "nvme_admin": false, 00:14:59.966 "nvme_io": false, 00:14:59.966 "nvme_io_md": false, 00:14:59.966 "write_zeroes": true, 00:14:59.966 "zcopy": true, 00:14:59.966 "get_zone_info": false, 00:14:59.966 "zone_management": false, 00:14:59.966 "zone_append": false, 00:14:59.966 "compare": false, 00:14:59.966 "compare_and_write": false, 00:14:59.966 "abort": true, 00:14:59.966 "seek_hole": false, 00:14:59.966 "seek_data": false, 00:14:59.966 "copy": true, 00:14:59.966 "nvme_iov_md": false 00:14:59.966 }, 00:14:59.966 "memory_domains": [ 00:14:59.966 { 00:14:59.966 "dma_device_id": "system", 00:14:59.966 "dma_device_type": 1 00:14:59.966 }, 00:14:59.966 { 00:14:59.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.966 "dma_device_type": 2 00:14:59.966 } 00:14:59.966 ], 00:14:59.966 "driver_specific": {} 00:14:59.966 } 00:14:59.966 ] 00:14:59.966 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:59.966 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:59.966 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:59.966 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:00.226 [2024-07-15 13:37:47.640820] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:00.226 [2024-07-15 13:37:47.640855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:00.226 [2024-07-15 13:37:47.640869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:00.226 [2024-07-15 13:37:47.641871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:00.226 [2024-07-15 13:37:47.641901] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.226 "name": "Existed_Raid", 00:15:00.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.226 "strip_size_kb": 64, 00:15:00.226 "state": "configuring", 00:15:00.226 "raid_level": "raid0", 00:15:00.226 "superblock": false, 00:15:00.226 "num_base_bdevs": 4, 00:15:00.226 "num_base_bdevs_discovered": 3, 00:15:00.226 "num_base_bdevs_operational": 4, 00:15:00.226 "base_bdevs_list": [ 00:15:00.226 { 00:15:00.226 "name": "BaseBdev1", 00:15:00.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.226 "is_configured": false, 00:15:00.226 "data_offset": 0, 00:15:00.226 "data_size": 0 00:15:00.226 }, 00:15:00.226 { 00:15:00.226 "name": "BaseBdev2", 00:15:00.226 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:00.226 "is_configured": true, 00:15:00.226 "data_offset": 0, 00:15:00.226 "data_size": 65536 00:15:00.226 }, 00:15:00.226 { 00:15:00.226 "name": "BaseBdev3", 00:15:00.226 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:00.226 "is_configured": true, 00:15:00.226 "data_offset": 0, 00:15:00.226 "data_size": 65536 00:15:00.226 }, 00:15:00.226 { 00:15:00.226 "name": "BaseBdev4", 00:15:00.226 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:00.226 "is_configured": true, 00:15:00.226 "data_offset": 0, 00:15:00.226 "data_size": 65536 00:15:00.226 } 00:15:00.226 ] 00:15:00.226 }' 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.226 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.792 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:01.051 [2024-07-15 13:37:48.462921] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:01.051 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:01.051 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.051 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.052 "name": "Existed_Raid", 00:15:01.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.052 "strip_size_kb": 64, 00:15:01.052 "state": "configuring", 00:15:01.052 "raid_level": "raid0", 00:15:01.052 "superblock": false, 00:15:01.052 "num_base_bdevs": 4, 00:15:01.052 "num_base_bdevs_discovered": 2, 00:15:01.052 "num_base_bdevs_operational": 4, 00:15:01.052 "base_bdevs_list": [ 00:15:01.052 { 00:15:01.052 "name": "BaseBdev1", 00:15:01.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.052 "is_configured": false, 00:15:01.052 "data_offset": 0, 00:15:01.052 "data_size": 0 00:15:01.052 }, 00:15:01.052 { 00:15:01.052 "name": null, 00:15:01.052 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:01.052 "is_configured": false, 00:15:01.052 "data_offset": 0, 00:15:01.052 "data_size": 65536 00:15:01.052 }, 00:15:01.052 { 00:15:01.052 "name": "BaseBdev3", 00:15:01.052 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:01.052 "is_configured": true, 00:15:01.052 "data_offset": 0, 00:15:01.052 "data_size": 65536 00:15:01.052 }, 00:15:01.052 { 00:15:01.052 "name": "BaseBdev4", 00:15:01.052 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:01.052 "is_configured": true, 00:15:01.052 "data_offset": 0, 00:15:01.052 "data_size": 65536 00:15:01.052 } 00:15:01.052 ] 00:15:01.052 }' 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.052 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.619 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.619 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:01.878 [2024-07-15 13:37:49.473519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:01.878 BaseBdev1 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:01.878 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:02.137 13:37:49 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:02.397 [ 00:15:02.397 { 00:15:02.397 "name": "BaseBdev1", 00:15:02.397 "aliases": [ 00:15:02.397 "90477632-7ffc-4793-9766-4bffe8c4dc48" 00:15:02.397 ], 00:15:02.397 "product_name": "Malloc disk", 00:15:02.397 "block_size": 512, 00:15:02.397 "num_blocks": 65536, 00:15:02.397 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:02.397 "assigned_rate_limits": { 00:15:02.397 "rw_ios_per_sec": 0, 00:15:02.397 "rw_mbytes_per_sec": 0, 00:15:02.397 "r_mbytes_per_sec": 0, 00:15:02.397 "w_mbytes_per_sec": 0 00:15:02.397 }, 00:15:02.397 "claimed": true, 00:15:02.397 "claim_type": "exclusive_write", 00:15:02.397 "zoned": false, 00:15:02.397 "supported_io_types": { 00:15:02.397 "read": true, 00:15:02.397 "write": true, 00:15:02.397 "unmap": true, 00:15:02.397 "flush": true, 00:15:02.397 "reset": true, 00:15:02.397 "nvme_admin": false, 00:15:02.397 "nvme_io": false, 00:15:02.397 "nvme_io_md": false, 00:15:02.397 "write_zeroes": true, 00:15:02.397 "zcopy": true, 00:15:02.397 "get_zone_info": false, 00:15:02.397 "zone_management": false, 00:15:02.397 "zone_append": false, 00:15:02.397 "compare": false, 00:15:02.397 "compare_and_write": false, 00:15:02.397 "abort": true, 00:15:02.397 "seek_hole": false, 00:15:02.397 "seek_data": false, 00:15:02.397 "copy": true, 00:15:02.397 "nvme_iov_md": false 00:15:02.397 }, 00:15:02.397 "memory_domains": [ 00:15:02.397 { 00:15:02.397 "dma_device_id": "system", 00:15:02.397 "dma_device_type": 1 00:15:02.397 }, 00:15:02.397 { 00:15:02.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.397 "dma_device_type": 2 00:15:02.397 } 00:15:02.397 ], 00:15:02.397 "driver_specific": {} 00:15:02.397 } 00:15:02.397 ] 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.397 "name": 
"Existed_Raid", 00:15:02.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.397 "strip_size_kb": 64, 00:15:02.397 "state": "configuring", 00:15:02.397 "raid_level": "raid0", 00:15:02.397 "superblock": false, 00:15:02.397 "num_base_bdevs": 4, 00:15:02.397 "num_base_bdevs_discovered": 3, 00:15:02.397 "num_base_bdevs_operational": 4, 00:15:02.397 "base_bdevs_list": [ 00:15:02.397 { 00:15:02.397 "name": "BaseBdev1", 00:15:02.397 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:02.397 "is_configured": true, 00:15:02.397 "data_offset": 0, 00:15:02.397 "data_size": 65536 00:15:02.397 }, 00:15:02.397 { 00:15:02.397 "name": null, 00:15:02.397 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:02.397 "is_configured": false, 00:15:02.397 "data_offset": 0, 00:15:02.397 "data_size": 65536 00:15:02.397 }, 00:15:02.397 { 00:15:02.397 "name": "BaseBdev3", 00:15:02.397 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:02.397 "is_configured": true, 00:15:02.397 "data_offset": 0, 00:15:02.397 "data_size": 65536 00:15:02.397 }, 00:15:02.397 { 00:15:02.397 "name": "BaseBdev4", 00:15:02.397 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:02.397 "is_configured": true, 00:15:02.397 "data_offset": 0, 00:15:02.397 "data_size": 65536 00:15:02.397 } 00:15:02.397 ] 00:15:02.397 }' 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.397 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.965 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.965 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:03.224 [2024-07-15 13:37:50.800970] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.224 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.483 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.483 "name": "Existed_Raid", 00:15:03.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.483 "strip_size_kb": 64, 00:15:03.483 "state": "configuring", 00:15:03.483 "raid_level": "raid0", 00:15:03.483 "superblock": false, 00:15:03.483 "num_base_bdevs": 4, 00:15:03.483 "num_base_bdevs_discovered": 2, 00:15:03.483 "num_base_bdevs_operational": 4, 00:15:03.483 "base_bdevs_list": [ 00:15:03.483 { 00:15:03.483 "name": "BaseBdev1", 00:15:03.483 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:03.483 "is_configured": true, 00:15:03.483 "data_offset": 0, 00:15:03.483 "data_size": 65536 00:15:03.483 }, 00:15:03.483 { 00:15:03.483 "name": null, 00:15:03.483 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:03.483 "is_configured": false, 00:15:03.483 "data_offset": 0, 00:15:03.483 "data_size": 65536 00:15:03.483 }, 00:15:03.483 { 00:15:03.483 "name": null, 00:15:03.483 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:03.483 "is_configured": false, 00:15:03.483 "data_offset": 0, 00:15:03.483 "data_size": 65536 00:15:03.483 }, 00:15:03.483 { 00:15:03.483 "name": "BaseBdev4", 00:15:03.483 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:03.483 "is_configured": true, 00:15:03.483 "data_offset": 0, 00:15:03.483 "data_size": 65536 00:15:03.483 } 00:15:03.483 ] 00:15:03.483 }' 00:15:03.483 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.483 13:37:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.051 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.051 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:04.051 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:04.051 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:04.310 [2024-07-15 13:37:51.815609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.310 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.569 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.569 "name": "Existed_Raid", 00:15:04.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.569 "strip_size_kb": 64, 00:15:04.569 "state": "configuring", 00:15:04.569 "raid_level": "raid0", 00:15:04.569 "superblock": false, 00:15:04.569 "num_base_bdevs": 4, 00:15:04.569 "num_base_bdevs_discovered": 3, 00:15:04.569 "num_base_bdevs_operational": 4, 00:15:04.569 "base_bdevs_list": [ 00:15:04.569 { 00:15:04.569 "name": "BaseBdev1", 00:15:04.569 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:04.569 "is_configured": true, 00:15:04.569 "data_offset": 0, 00:15:04.569 "data_size": 65536 00:15:04.569 }, 00:15:04.569 { 00:15:04.569 "name": null, 00:15:04.569 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:04.569 "is_configured": false, 00:15:04.569 "data_offset": 0, 00:15:04.569 "data_size": 65536 00:15:04.569 }, 00:15:04.569 { 00:15:04.569 "name": "BaseBdev3", 00:15:04.569 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:04.569 "is_configured": true, 00:15:04.569 "data_offset": 0, 00:15:04.569 "data_size": 65536 00:15:04.569 }, 00:15:04.569 { 00:15:04.569 "name": "BaseBdev4", 00:15:04.569 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:04.569 "is_configured": true, 00:15:04.569 "data_offset": 0, 00:15:04.569 "data_size": 65536 00:15:04.569 } 00:15:04.569 ] 00:15:04.569 }' 00:15:04.569 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.570 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.138 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.138 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:05.138 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:05.138 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:05.397 [2024-07-15 13:37:52.826274] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.397 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:05.656 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.656 "name": "Existed_Raid", 00:15:05.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:05.656 "strip_size_kb": 64, 00:15:05.656 "state": "configuring", 00:15:05.656 "raid_level": "raid0", 00:15:05.656 "superblock": false, 00:15:05.656 "num_base_bdevs": 4, 00:15:05.656 "num_base_bdevs_discovered": 2, 00:15:05.656 "num_base_bdevs_operational": 4, 00:15:05.656 "base_bdevs_list": [ 00:15:05.656 { 00:15:05.656 "name": null, 00:15:05.656 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:05.656 "is_configured": false, 00:15:05.656 "data_offset": 0, 00:15:05.656 "data_size": 65536 00:15:05.656 }, 00:15:05.656 { 00:15:05.656 "name": null, 00:15:05.656 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:05.656 "is_configured": false, 00:15:05.656 "data_offset": 0, 00:15:05.656 "data_size": 65536 00:15:05.656 }, 00:15:05.657 { 00:15:05.657 "name": "BaseBdev3", 00:15:05.657 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:05.657 "is_configured": true, 00:15:05.657 "data_offset": 0, 00:15:05.657 "data_size": 65536 00:15:05.657 }, 00:15:05.657 { 00:15:05.657 "name": "BaseBdev4", 00:15:05.657 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:05.657 "is_configured": true, 00:15:05.657 "data_offset": 0, 00:15:05.657 "data_size": 65536 00:15:05.657 } 00:15:05.657 ] 00:15:05.657 }' 00:15:05.657 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.657 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.252 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.252 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:06.252 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:06.252 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:06.604 [2024-07-15 13:37:53.895139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.604 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.604 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.604 "name": "Existed_Raid", 00:15:06.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.604 "strip_size_kb": 64, 00:15:06.604 "state": "configuring", 00:15:06.604 "raid_level": "raid0", 00:15:06.604 "superblock": false, 00:15:06.604 "num_base_bdevs": 4, 00:15:06.604 "num_base_bdevs_discovered": 3, 00:15:06.604 "num_base_bdevs_operational": 4, 00:15:06.604 "base_bdevs_list": [ 00:15:06.604 { 00:15:06.604 "name": null, 00:15:06.604 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:06.604 "is_configured": false, 00:15:06.604 "data_offset": 0, 00:15:06.604 "data_size": 65536 00:15:06.604 }, 00:15:06.604 { 00:15:06.604 "name": "BaseBdev2", 00:15:06.604 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:06.604 "is_configured": true, 00:15:06.604 "data_offset": 0, 00:15:06.604 "data_size": 65536 00:15:06.604 }, 00:15:06.604 { 00:15:06.604 "name": "BaseBdev3", 00:15:06.604 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:06.604 "is_configured": true, 00:15:06.604 "data_offset": 0, 00:15:06.604 "data_size": 65536 00:15:06.604 }, 00:15:06.604 { 00:15:06.604 "name": "BaseBdev4", 00:15:06.604 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:06.604 "is_configured": true, 00:15:06.604 "data_offset": 0, 00:15:06.604 "data_size": 65536 00:15:06.604 } 00:15:06.604 ] 00:15:06.604 }' 00:15:06.604 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.604 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.172 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:07.172 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.172 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:07.172 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:07.172 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:07.431 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 90477632-7ffc-4793-9766-4bffe8c4dc48 00:15:07.690 [2024-07-15 13:37:55.070105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:07.690 [2024-07-15 13:37:55.070134] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bf0420 00:15:07.690 [2024-07-15 13:37:55.070140] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:07.690 [2024-07-15 13:37:55.070275] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf5450 00:15:07.690 [2024-07-15 13:37:55.070357] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bf0420 00:15:07.690 [2024-07-15 13:37:55.070364] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bf0420 00:15:07.690 [2024-07-15 13:37:55.070501] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:07.690 NewBaseBdev 00:15:07.690 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:07.690 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:07.690 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:07.690 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:07.690 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:07.690 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:07.691 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:07.691 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:07.950 [ 00:15:07.950 { 00:15:07.950 "name": "NewBaseBdev", 00:15:07.950 "aliases": [ 00:15:07.950 "90477632-7ffc-4793-9766-4bffe8c4dc48" 00:15:07.950 ], 00:15:07.950 "product_name": "Malloc disk", 00:15:07.950 "block_size": 512, 00:15:07.950 "num_blocks": 65536, 00:15:07.950 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:07.950 "assigned_rate_limits": { 00:15:07.950 "rw_ios_per_sec": 0, 00:15:07.950 "rw_mbytes_per_sec": 0, 00:15:07.950 "r_mbytes_per_sec": 0, 00:15:07.950 "w_mbytes_per_sec": 0 00:15:07.950 }, 00:15:07.950 "claimed": true, 00:15:07.950 "claim_type": "exclusive_write", 00:15:07.950 "zoned": false, 00:15:07.950 "supported_io_types": { 00:15:07.950 "read": true, 00:15:07.950 "write": true, 00:15:07.950 "unmap": true, 00:15:07.950 "flush": true, 00:15:07.950 "reset": true, 00:15:07.950 "nvme_admin": false, 00:15:07.950 "nvme_io": false, 00:15:07.950 "nvme_io_md": false, 00:15:07.950 "write_zeroes": true, 00:15:07.950 "zcopy": true, 00:15:07.950 "get_zone_info": false, 00:15:07.950 "zone_management": false, 00:15:07.950 "zone_append": false, 00:15:07.950 "compare": false, 00:15:07.950 
"compare_and_write": false, 00:15:07.950 "abort": true, 00:15:07.950 "seek_hole": false, 00:15:07.950 "seek_data": false, 00:15:07.950 "copy": true, 00:15:07.950 "nvme_iov_md": false 00:15:07.950 }, 00:15:07.950 "memory_domains": [ 00:15:07.950 { 00:15:07.950 "dma_device_id": "system", 00:15:07.950 "dma_device_type": 1 00:15:07.950 }, 00:15:07.950 { 00:15:07.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:07.950 "dma_device_type": 2 00:15:07.950 } 00:15:07.950 ], 00:15:07.950 "driver_specific": {} 00:15:07.950 } 00:15:07.950 ] 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.950 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.210 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.210 "name": "Existed_Raid", 00:15:08.210 "uuid": "18382264-ab05-4c27-8cf7-7f24378b92fe", 00:15:08.210 "strip_size_kb": 64, 00:15:08.210 "state": "online", 00:15:08.210 "raid_level": "raid0", 00:15:08.210 "superblock": false, 00:15:08.210 "num_base_bdevs": 4, 00:15:08.210 "num_base_bdevs_discovered": 4, 00:15:08.210 "num_base_bdevs_operational": 4, 00:15:08.210 "base_bdevs_list": [ 00:15:08.210 { 00:15:08.210 "name": "NewBaseBdev", 00:15:08.210 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:08.210 "is_configured": true, 00:15:08.210 "data_offset": 0, 00:15:08.210 "data_size": 65536 00:15:08.210 }, 00:15:08.210 { 00:15:08.210 "name": "BaseBdev2", 00:15:08.210 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:08.210 "is_configured": true, 00:15:08.210 "data_offset": 0, 00:15:08.210 "data_size": 65536 00:15:08.210 }, 00:15:08.210 { 00:15:08.210 "name": "BaseBdev3", 00:15:08.210 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:08.210 "is_configured": true, 00:15:08.210 "data_offset": 0, 00:15:08.210 "data_size": 65536 00:15:08.210 }, 00:15:08.210 { 00:15:08.210 "name": "BaseBdev4", 00:15:08.210 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:08.210 "is_configured": true, 00:15:08.210 "data_offset": 0, 00:15:08.210 "data_size": 65536 00:15:08.210 } 00:15:08.210 ] 00:15:08.210 }' 
00:15:08.210 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.210 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.778 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:08.778 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:08.778 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:08.779 [2024-07-15 13:37:56.257401] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:08.779 "name": "Existed_Raid", 00:15:08.779 "aliases": [ 00:15:08.779 "18382264-ab05-4c27-8cf7-7f24378b92fe" 00:15:08.779 ], 00:15:08.779 "product_name": "Raid Volume", 00:15:08.779 "block_size": 512, 00:15:08.779 "num_blocks": 262144, 00:15:08.779 "uuid": "18382264-ab05-4c27-8cf7-7f24378b92fe", 00:15:08.779 "assigned_rate_limits": { 00:15:08.779 "rw_ios_per_sec": 0, 00:15:08.779 "rw_mbytes_per_sec": 0, 00:15:08.779 "r_mbytes_per_sec": 0, 00:15:08.779 "w_mbytes_per_sec": 0 00:15:08.779 }, 00:15:08.779 "claimed": false, 00:15:08.779 "zoned": false, 00:15:08.779 "supported_io_types": { 00:15:08.779 "read": true, 00:15:08.779 "write": true, 00:15:08.779 "unmap": true, 00:15:08.779 "flush": true, 00:15:08.779 "reset": true, 00:15:08.779 "nvme_admin": false, 00:15:08.779 "nvme_io": false, 00:15:08.779 "nvme_io_md": false, 00:15:08.779 "write_zeroes": true, 00:15:08.779 "zcopy": false, 00:15:08.779 "get_zone_info": false, 00:15:08.779 "zone_management": false, 00:15:08.779 "zone_append": false, 00:15:08.779 "compare": false, 00:15:08.779 "compare_and_write": false, 00:15:08.779 "abort": false, 00:15:08.779 "seek_hole": false, 00:15:08.779 "seek_data": false, 00:15:08.779 "copy": false, 00:15:08.779 "nvme_iov_md": false 00:15:08.779 }, 00:15:08.779 "memory_domains": [ 00:15:08.779 { 00:15:08.779 "dma_device_id": "system", 00:15:08.779 "dma_device_type": 1 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.779 "dma_device_type": 2 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "system", 00:15:08.779 "dma_device_type": 1 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.779 "dma_device_type": 2 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "system", 00:15:08.779 "dma_device_type": 1 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.779 "dma_device_type": 2 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "system", 00:15:08.779 "dma_device_type": 1 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:08.779 "dma_device_type": 2 00:15:08.779 } 00:15:08.779 ], 00:15:08.779 "driver_specific": { 00:15:08.779 "raid": { 00:15:08.779 "uuid": "18382264-ab05-4c27-8cf7-7f24378b92fe", 00:15:08.779 "strip_size_kb": 64, 00:15:08.779 "state": "online", 00:15:08.779 "raid_level": "raid0", 00:15:08.779 "superblock": false, 00:15:08.779 "num_base_bdevs": 4, 00:15:08.779 "num_base_bdevs_discovered": 4, 00:15:08.779 "num_base_bdevs_operational": 4, 00:15:08.779 "base_bdevs_list": [ 00:15:08.779 { 00:15:08.779 "name": "NewBaseBdev", 00:15:08.779 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:08.779 "is_configured": true, 00:15:08.779 "data_offset": 0, 00:15:08.779 "data_size": 65536 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "name": "BaseBdev2", 00:15:08.779 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:08.779 "is_configured": true, 00:15:08.779 "data_offset": 0, 00:15:08.779 "data_size": 65536 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "name": "BaseBdev3", 00:15:08.779 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:08.779 "is_configured": true, 00:15:08.779 "data_offset": 0, 00:15:08.779 "data_size": 65536 00:15:08.779 }, 00:15:08.779 { 00:15:08.779 "name": "BaseBdev4", 00:15:08.779 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:08.779 "is_configured": true, 00:15:08.779 "data_offset": 0, 00:15:08.779 "data_size": 65536 00:15:08.779 } 00:15:08.779 ] 00:15:08.779 } 00:15:08.779 } 00:15:08.779 }' 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:08.779 BaseBdev2 00:15:08.779 BaseBdev3 00:15:08.779 BaseBdev4' 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:08.779 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.039 "name": "NewBaseBdev", 00:15:09.039 "aliases": [ 00:15:09.039 "90477632-7ffc-4793-9766-4bffe8c4dc48" 00:15:09.039 ], 00:15:09.039 "product_name": "Malloc disk", 00:15:09.039 "block_size": 512, 00:15:09.039 "num_blocks": 65536, 00:15:09.039 "uuid": "90477632-7ffc-4793-9766-4bffe8c4dc48", 00:15:09.039 "assigned_rate_limits": { 00:15:09.039 "rw_ios_per_sec": 0, 00:15:09.039 "rw_mbytes_per_sec": 0, 00:15:09.039 "r_mbytes_per_sec": 0, 00:15:09.039 "w_mbytes_per_sec": 0 00:15:09.039 }, 00:15:09.039 "claimed": true, 00:15:09.039 "claim_type": "exclusive_write", 00:15:09.039 "zoned": false, 00:15:09.039 "supported_io_types": { 00:15:09.039 "read": true, 00:15:09.039 "write": true, 00:15:09.039 "unmap": true, 00:15:09.039 "flush": true, 00:15:09.039 "reset": true, 00:15:09.039 "nvme_admin": false, 00:15:09.039 "nvme_io": false, 00:15:09.039 "nvme_io_md": false, 00:15:09.039 "write_zeroes": true, 00:15:09.039 "zcopy": true, 00:15:09.039 "get_zone_info": false, 00:15:09.039 "zone_management": false, 00:15:09.039 "zone_append": false, 00:15:09.039 "compare": false, 00:15:09.039 "compare_and_write": false, 00:15:09.039 "abort": true, 00:15:09.039 "seek_hole": false, 00:15:09.039 "seek_data": false, 00:15:09.039 
"copy": true, 00:15:09.039 "nvme_iov_md": false 00:15:09.039 }, 00:15:09.039 "memory_domains": [ 00:15:09.039 { 00:15:09.039 "dma_device_id": "system", 00:15:09.039 "dma_device_type": 1 00:15:09.039 }, 00:15:09.039 { 00:15:09.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.039 "dma_device_type": 2 00:15:09.039 } 00:15:09.039 ], 00:15:09.039 "driver_specific": {} 00:15:09.039 }' 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.039 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.298 "name": "BaseBdev2", 00:15:09.298 "aliases": [ 00:15:09.298 "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca" 00:15:09.298 ], 00:15:09.298 "product_name": "Malloc disk", 00:15:09.298 "block_size": 512, 00:15:09.298 "num_blocks": 65536, 00:15:09.298 "uuid": "bc071dd2-e40d-4cb6-b025-ae2371f6b6ca", 00:15:09.298 "assigned_rate_limits": { 00:15:09.298 "rw_ios_per_sec": 0, 00:15:09.298 "rw_mbytes_per_sec": 0, 00:15:09.298 "r_mbytes_per_sec": 0, 00:15:09.298 "w_mbytes_per_sec": 0 00:15:09.298 }, 00:15:09.298 "claimed": true, 00:15:09.298 "claim_type": "exclusive_write", 00:15:09.298 "zoned": false, 00:15:09.298 "supported_io_types": { 00:15:09.298 "read": true, 00:15:09.298 "write": true, 00:15:09.298 "unmap": true, 00:15:09.298 "flush": true, 00:15:09.298 "reset": true, 00:15:09.298 "nvme_admin": false, 00:15:09.298 "nvme_io": false, 00:15:09.298 "nvme_io_md": false, 00:15:09.298 "write_zeroes": true, 00:15:09.298 "zcopy": true, 00:15:09.298 "get_zone_info": false, 00:15:09.298 "zone_management": false, 00:15:09.298 "zone_append": false, 00:15:09.298 "compare": false, 00:15:09.298 "compare_and_write": false, 00:15:09.298 "abort": true, 00:15:09.298 "seek_hole": false, 00:15:09.298 "seek_data": false, 00:15:09.298 "copy": true, 00:15:09.298 "nvme_iov_md": false 00:15:09.298 }, 00:15:09.298 "memory_domains": [ 00:15:09.298 { 00:15:09.298 "dma_device_id": "system", 00:15:09.298 
"dma_device_type": 1 00:15:09.298 }, 00:15:09.298 { 00:15:09.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.298 "dma_device_type": 2 00:15:09.298 } 00:15:09.298 ], 00:15:09.298 "driver_specific": {} 00:15:09.298 }' 00:15:09.298 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.557 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.557 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.557 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.557 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.817 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.817 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.817 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.817 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:09.817 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.817 "name": "BaseBdev3", 00:15:09.817 "aliases": [ 00:15:09.817 "70c918a7-3ba0-45ff-8a16-1ad1c574b731" 00:15:09.817 ], 00:15:09.817 "product_name": "Malloc disk", 00:15:09.817 "block_size": 512, 00:15:09.817 "num_blocks": 65536, 00:15:09.817 "uuid": "70c918a7-3ba0-45ff-8a16-1ad1c574b731", 00:15:09.817 "assigned_rate_limits": { 00:15:09.817 "rw_ios_per_sec": 0, 00:15:09.817 "rw_mbytes_per_sec": 0, 00:15:09.817 "r_mbytes_per_sec": 0, 00:15:09.817 "w_mbytes_per_sec": 0 00:15:09.817 }, 00:15:09.817 "claimed": true, 00:15:09.817 "claim_type": "exclusive_write", 00:15:09.817 "zoned": false, 00:15:09.817 "supported_io_types": { 00:15:09.817 "read": true, 00:15:09.817 "write": true, 00:15:09.817 "unmap": true, 00:15:09.817 "flush": true, 00:15:09.817 "reset": true, 00:15:09.817 "nvme_admin": false, 00:15:09.817 "nvme_io": false, 00:15:09.817 "nvme_io_md": false, 00:15:09.817 "write_zeroes": true, 00:15:09.817 "zcopy": true, 00:15:09.817 "get_zone_info": false, 00:15:09.817 "zone_management": false, 00:15:09.817 "zone_append": false, 00:15:09.817 "compare": false, 00:15:09.817 "compare_and_write": false, 00:15:09.817 "abort": true, 00:15:09.817 "seek_hole": false, 00:15:09.817 "seek_data": false, 00:15:09.817 "copy": true, 00:15:09.817 "nvme_iov_md": false 00:15:09.817 }, 00:15:09.817 "memory_domains": [ 00:15:09.817 { 00:15:09.817 "dma_device_id": "system", 00:15:09.817 "dma_device_type": 1 00:15:09.817 }, 00:15:09.817 { 00:15:09.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.817 "dma_device_type": 2 00:15:09.817 } 00:15:09.817 ], 
00:15:09.817 "driver_specific": {} 00:15:09.817 }' 00:15:09.817 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.076 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.335 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.335 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.335 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:10.335 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.335 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:10.335 "name": "BaseBdev4", 00:15:10.335 "aliases": [ 00:15:10.335 "171463aa-31b3-4e64-912e-26bfec6a1fc7" 00:15:10.335 ], 00:15:10.335 "product_name": "Malloc disk", 00:15:10.335 "block_size": 512, 00:15:10.335 "num_blocks": 65536, 00:15:10.335 "uuid": "171463aa-31b3-4e64-912e-26bfec6a1fc7", 00:15:10.335 "assigned_rate_limits": { 00:15:10.335 "rw_ios_per_sec": 0, 00:15:10.335 "rw_mbytes_per_sec": 0, 00:15:10.335 "r_mbytes_per_sec": 0, 00:15:10.335 "w_mbytes_per_sec": 0 00:15:10.335 }, 00:15:10.335 "claimed": true, 00:15:10.335 "claim_type": "exclusive_write", 00:15:10.335 "zoned": false, 00:15:10.335 "supported_io_types": { 00:15:10.335 "read": true, 00:15:10.335 "write": true, 00:15:10.335 "unmap": true, 00:15:10.335 "flush": true, 00:15:10.335 "reset": true, 00:15:10.335 "nvme_admin": false, 00:15:10.335 "nvme_io": false, 00:15:10.335 "nvme_io_md": false, 00:15:10.335 "write_zeroes": true, 00:15:10.335 "zcopy": true, 00:15:10.335 "get_zone_info": false, 00:15:10.335 "zone_management": false, 00:15:10.335 "zone_append": false, 00:15:10.335 "compare": false, 00:15:10.335 "compare_and_write": false, 00:15:10.335 "abort": true, 00:15:10.335 "seek_hole": false, 00:15:10.335 "seek_data": false, 00:15:10.335 "copy": true, 00:15:10.335 "nvme_iov_md": false 00:15:10.335 }, 00:15:10.335 "memory_domains": [ 00:15:10.335 { 00:15:10.335 "dma_device_id": "system", 00:15:10.335 "dma_device_type": 1 00:15:10.335 }, 00:15:10.335 { 00:15:10.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.335 "dma_device_type": 2 00:15:10.335 } 00:15:10.335 ], 00:15:10.335 "driver_specific": {} 00:15:10.335 }' 00:15:10.335 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.335 13:37:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.595 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.595 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.595 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:10.855 [2024-07-15 13:37:58.358606] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:10.855 [2024-07-15 13:37:58.358627] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:10.855 [2024-07-15 13:37:58.358665] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:10.855 [2024-07-15 13:37:58.358707] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:10.855 [2024-07-15 13:37:58.358715] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bf0420 name Existed_Raid, state offline 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 21292 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 21292 ']' 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 21292 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 21292 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 21292' 00:15:10.855 killing process with pid 21292 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 21292 00:15:10.855 [2024-07-15 13:37:58.426769] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:10.855 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 21292 00:15:10.855 [2024-07-15 13:37:58.463386] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:11.114 13:37:58 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@343 -- # return 0 00:15:11.114 00:15:11.114 real 0m24.896s 00:15:11.114 user 0m45.399s 00:15:11.114 sys 0m4.865s 00:15:11.114 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:11.114 13:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.114 ************************************ 00:15:11.114 END TEST raid_state_function_test 00:15:11.114 ************************************ 00:15:11.114 13:37:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:11.114 13:37:58 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:15:11.114 13:37:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:11.114 13:37:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:11.114 13:37:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:11.114 ************************************ 00:15:11.114 START TEST raid_state_function_test_sb 00:15:11.114 ************************************ 00:15:11.114 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:15:11.114 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:11.114 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=25248 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 25248' 00:15:11.374 Process raid pid: 25248 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 25248 /var/tmp/spdk-raid.sock 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 25248 ']' 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:11.374 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:11.374 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.374 [2024-07-15 13:37:58.792319] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:15:11.374 [2024-07-15 13:37:58.792370] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:11.374 [2024-07-15 13:37:58.880476] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:11.374 [2024-07-15 13:37:58.977615] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:11.633 [2024-07-15 13:37:59.035438] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:11.633 [2024-07-15 13:37:59.035460] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:12.202 [2024-07-15 13:37:59.771450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:12.202 [2024-07-15 13:37:59.771484] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:12.202 [2024-07-15 13:37:59.771492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:12.202 [2024-07-15 13:37:59.771499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:12.202 [2024-07-15 13:37:59.771504] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:12.202 [2024-07-15 13:37:59.771511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:12.202 [2024-07-15 13:37:59.771517] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:12.202 [2024-07-15 13:37:59.771523] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.202 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.460 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.460 "name": "Existed_Raid", 00:15:12.460 "uuid": "60b0d8ef-5e78-413c-9bf0-fe2db8280eff", 00:15:12.460 "strip_size_kb": 64, 00:15:12.460 "state": "configuring", 00:15:12.460 "raid_level": "raid0", 00:15:12.460 "superblock": true, 00:15:12.460 "num_base_bdevs": 4, 00:15:12.460 "num_base_bdevs_discovered": 0, 00:15:12.460 "num_base_bdevs_operational": 4, 00:15:12.460 "base_bdevs_list": [ 00:15:12.460 { 00:15:12.460 "name": "BaseBdev1", 00:15:12.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.460 "is_configured": false, 00:15:12.460 "data_offset": 0, 00:15:12.460 "data_size": 0 00:15:12.460 }, 00:15:12.460 { 00:15:12.460 "name": "BaseBdev2", 00:15:12.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.460 "is_configured": false, 00:15:12.460 "data_offset": 0, 00:15:12.460 "data_size": 0 00:15:12.460 }, 00:15:12.460 { 00:15:12.460 "name": "BaseBdev3", 00:15:12.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.460 "is_configured": false, 00:15:12.460 "data_offset": 0, 00:15:12.460 "data_size": 0 00:15:12.460 }, 00:15:12.460 { 00:15:12.460 "name": "BaseBdev4", 00:15:12.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.460 "is_configured": false, 00:15:12.460 "data_offset": 0, 00:15:12.460 "data_size": 0 00:15:12.460 } 00:15:12.460 ] 00:15:12.460 }' 00:15:12.460 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.460 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.026 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:13.026 [2024-07-15 13:38:00.605517] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:13.026 [2024-07-15 13:38:00.605544] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2289f70 name Existed_Raid, state configuring 00:15:13.026 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:13.285 [2024-07-15 13:38:00.781998] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:13.285 [2024-07-15 13:38:00.782025] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:13.285 [2024-07-15 13:38:00.782032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:13.285 [2024-07-15 13:38:00.782039] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:13.285 [2024-07-15 13:38:00.782062] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:13.285 [2024-07-15 13:38:00.782069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:13.285 [2024-07-15 13:38:00.782075] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
00:15:13.285 [2024-07-15 13:38:00.782082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:13.285 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:13.542 [2024-07-15 13:38:00.955078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:13.542 BaseBdev1 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:13.542 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:13.542 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:13.800 [ 00:15:13.800 { 00:15:13.800 "name": "BaseBdev1", 00:15:13.800 "aliases": [ 00:15:13.800 "c00f225a-3fa6-4bb2-84fe-6bd76457651b" 00:15:13.800 ], 00:15:13.800 "product_name": "Malloc disk", 00:15:13.800 "block_size": 512, 00:15:13.800 "num_blocks": 65536, 00:15:13.800 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:13.800 "assigned_rate_limits": { 00:15:13.800 "rw_ios_per_sec": 0, 00:15:13.800 "rw_mbytes_per_sec": 0, 00:15:13.800 "r_mbytes_per_sec": 0, 00:15:13.800 "w_mbytes_per_sec": 0 00:15:13.800 }, 00:15:13.800 "claimed": true, 00:15:13.800 "claim_type": "exclusive_write", 00:15:13.800 "zoned": false, 00:15:13.800 "supported_io_types": { 00:15:13.800 "read": true, 00:15:13.800 "write": true, 00:15:13.800 "unmap": true, 00:15:13.800 "flush": true, 00:15:13.800 "reset": true, 00:15:13.800 "nvme_admin": false, 00:15:13.800 "nvme_io": false, 00:15:13.800 "nvme_io_md": false, 00:15:13.800 "write_zeroes": true, 00:15:13.800 "zcopy": true, 00:15:13.800 "get_zone_info": false, 00:15:13.800 "zone_management": false, 00:15:13.800 "zone_append": false, 00:15:13.800 "compare": false, 00:15:13.800 "compare_and_write": false, 00:15:13.800 "abort": true, 00:15:13.800 "seek_hole": false, 00:15:13.800 "seek_data": false, 00:15:13.800 "copy": true, 00:15:13.800 "nvme_iov_md": false 00:15:13.800 }, 00:15:13.800 "memory_domains": [ 00:15:13.800 { 00:15:13.800 "dma_device_id": "system", 00:15:13.800 "dma_device_type": 1 00:15:13.800 }, 00:15:13.800 { 00:15:13.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.800 "dma_device_type": 2 00:15:13.800 } 00:15:13.800 ], 00:15:13.800 "driver_specific": {} 00:15:13.800 } 00:15:13.800 ] 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:13.800 
13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.800 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.059 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.059 "name": "Existed_Raid", 00:15:14.059 "uuid": "815f0a4b-71c8-4e89-bcc0-80e151447d1d", 00:15:14.059 "strip_size_kb": 64, 00:15:14.059 "state": "configuring", 00:15:14.059 "raid_level": "raid0", 00:15:14.059 "superblock": true, 00:15:14.059 "num_base_bdevs": 4, 00:15:14.059 "num_base_bdevs_discovered": 1, 00:15:14.059 "num_base_bdevs_operational": 4, 00:15:14.059 "base_bdevs_list": [ 00:15:14.059 { 00:15:14.059 "name": "BaseBdev1", 00:15:14.059 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:14.059 "is_configured": true, 00:15:14.059 "data_offset": 2048, 00:15:14.059 "data_size": 63488 00:15:14.059 }, 00:15:14.059 { 00:15:14.059 "name": "BaseBdev2", 00:15:14.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.059 "is_configured": false, 00:15:14.059 "data_offset": 0, 00:15:14.059 "data_size": 0 00:15:14.059 }, 00:15:14.059 { 00:15:14.059 "name": "BaseBdev3", 00:15:14.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.059 "is_configured": false, 00:15:14.059 "data_offset": 0, 00:15:14.059 "data_size": 0 00:15:14.059 }, 00:15:14.059 { 00:15:14.059 "name": "BaseBdev4", 00:15:14.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.059 "is_configured": false, 00:15:14.059 "data_offset": 0, 00:15:14.059 "data_size": 0 00:15:14.059 } 00:15:14.059 ] 00:15:14.059 }' 00:15:14.059 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.059 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.627 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:14.627 [2024-07-15 13:38:02.138127] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:14.627 [2024-07-15 13:38:02.138160] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22897e0 name Existed_Raid, state configuring 00:15:14.627 13:38:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:14.885 [2024-07-15 13:38:02.314616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:14.885 [2024-07-15 13:38:02.315654] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:14.885 [2024-07-15 13:38:02.315680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:14.885 [2024-07-15 13:38:02.315687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:14.885 [2024-07-15 13:38:02.315695] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:14.885 [2024-07-15 13:38:02.315701] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:14.885 [2024-07-15 13:38:02.315709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.885 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.143 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.143 "name": "Existed_Raid", 00:15:15.143 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:15.143 "strip_size_kb": 64, 00:15:15.143 "state": "configuring", 00:15:15.143 "raid_level": "raid0", 00:15:15.143 "superblock": true, 00:15:15.143 "num_base_bdevs": 4, 00:15:15.143 "num_base_bdevs_discovered": 1, 00:15:15.143 "num_base_bdevs_operational": 4, 00:15:15.143 "base_bdevs_list": [ 00:15:15.143 { 00:15:15.143 "name": "BaseBdev1", 00:15:15.144 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:15.144 "is_configured": true, 00:15:15.144 "data_offset": 2048, 
00:15:15.144 "data_size": 63488 00:15:15.144 }, 00:15:15.144 { 00:15:15.144 "name": "BaseBdev2", 00:15:15.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.144 "is_configured": false, 00:15:15.144 "data_offset": 0, 00:15:15.144 "data_size": 0 00:15:15.144 }, 00:15:15.144 { 00:15:15.144 "name": "BaseBdev3", 00:15:15.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.144 "is_configured": false, 00:15:15.144 "data_offset": 0, 00:15:15.144 "data_size": 0 00:15:15.144 }, 00:15:15.144 { 00:15:15.144 "name": "BaseBdev4", 00:15:15.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.144 "is_configured": false, 00:15:15.144 "data_offset": 0, 00:15:15.144 "data_size": 0 00:15:15.144 } 00:15:15.144 ] 00:15:15.144 }' 00:15:15.144 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.144 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.401 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:15.659 [2024-07-15 13:38:03.171588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:15.659 BaseBdev2 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:15.659 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.917 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:15.917 [ 00:15:15.917 { 00:15:15.917 "name": "BaseBdev2", 00:15:15.917 "aliases": [ 00:15:15.917 "8d51585a-5b6b-4745-a082-3f06e7a732a7" 00:15:15.917 ], 00:15:15.917 "product_name": "Malloc disk", 00:15:15.917 "block_size": 512, 00:15:15.917 "num_blocks": 65536, 00:15:15.917 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:15.917 "assigned_rate_limits": { 00:15:15.917 "rw_ios_per_sec": 0, 00:15:15.917 "rw_mbytes_per_sec": 0, 00:15:15.917 "r_mbytes_per_sec": 0, 00:15:15.917 "w_mbytes_per_sec": 0 00:15:15.917 }, 00:15:15.917 "claimed": true, 00:15:15.917 "claim_type": "exclusive_write", 00:15:15.917 "zoned": false, 00:15:15.917 "supported_io_types": { 00:15:15.917 "read": true, 00:15:15.917 "write": true, 00:15:15.917 "unmap": true, 00:15:15.917 "flush": true, 00:15:15.917 "reset": true, 00:15:15.917 "nvme_admin": false, 00:15:15.917 "nvme_io": false, 00:15:15.917 "nvme_io_md": false, 00:15:15.917 "write_zeroes": true, 00:15:15.917 "zcopy": true, 00:15:15.917 "get_zone_info": false, 00:15:15.917 "zone_management": false, 00:15:15.917 "zone_append": false, 00:15:15.917 "compare": false, 
00:15:15.917 "compare_and_write": false, 00:15:15.917 "abort": true, 00:15:15.917 "seek_hole": false, 00:15:15.917 "seek_data": false, 00:15:15.917 "copy": true, 00:15:15.917 "nvme_iov_md": false 00:15:15.917 }, 00:15:15.917 "memory_domains": [ 00:15:15.917 { 00:15:15.917 "dma_device_id": "system", 00:15:15.917 "dma_device_type": 1 00:15:15.917 }, 00:15:15.917 { 00:15:15.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.917 "dma_device_type": 2 00:15:15.917 } 00:15:15.917 ], 00:15:15.917 "driver_specific": {} 00:15:15.917 } 00:15:15.917 ] 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.174 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.174 "name": "Existed_Raid", 00:15:16.174 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:16.174 "strip_size_kb": 64, 00:15:16.174 "state": "configuring", 00:15:16.175 "raid_level": "raid0", 00:15:16.175 "superblock": true, 00:15:16.175 "num_base_bdevs": 4, 00:15:16.175 "num_base_bdevs_discovered": 2, 00:15:16.175 "num_base_bdevs_operational": 4, 00:15:16.175 "base_bdevs_list": [ 00:15:16.175 { 00:15:16.175 "name": "BaseBdev1", 00:15:16.175 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:16.175 "is_configured": true, 00:15:16.175 "data_offset": 2048, 00:15:16.175 "data_size": 63488 00:15:16.175 }, 00:15:16.175 { 00:15:16.175 "name": "BaseBdev2", 00:15:16.175 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:16.175 "is_configured": true, 00:15:16.175 "data_offset": 2048, 00:15:16.175 "data_size": 63488 00:15:16.175 }, 00:15:16.175 { 00:15:16.175 "name": "BaseBdev3", 00:15:16.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.175 "is_configured": false, 00:15:16.175 "data_offset": 0, 00:15:16.175 
"data_size": 0 00:15:16.175 }, 00:15:16.175 { 00:15:16.175 "name": "BaseBdev4", 00:15:16.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.175 "is_configured": false, 00:15:16.175 "data_offset": 0, 00:15:16.175 "data_size": 0 00:15:16.175 } 00:15:16.175 ] 00:15:16.175 }' 00:15:16.175 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.175 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.739 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:16.996 [2024-07-15 13:38:04.377518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:16.996 BaseBdev3 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.996 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:17.254 [ 00:15:17.254 { 00:15:17.254 "name": "BaseBdev3", 00:15:17.254 "aliases": [ 00:15:17.254 "fc63f8fa-8533-401d-a8aa-f4cc352ac364" 00:15:17.254 ], 00:15:17.254 "product_name": "Malloc disk", 00:15:17.254 "block_size": 512, 00:15:17.254 "num_blocks": 65536, 00:15:17.254 "uuid": "fc63f8fa-8533-401d-a8aa-f4cc352ac364", 00:15:17.254 "assigned_rate_limits": { 00:15:17.254 "rw_ios_per_sec": 0, 00:15:17.254 "rw_mbytes_per_sec": 0, 00:15:17.254 "r_mbytes_per_sec": 0, 00:15:17.254 "w_mbytes_per_sec": 0 00:15:17.254 }, 00:15:17.254 "claimed": true, 00:15:17.254 "claim_type": "exclusive_write", 00:15:17.254 "zoned": false, 00:15:17.254 "supported_io_types": { 00:15:17.254 "read": true, 00:15:17.254 "write": true, 00:15:17.254 "unmap": true, 00:15:17.254 "flush": true, 00:15:17.254 "reset": true, 00:15:17.254 "nvme_admin": false, 00:15:17.254 "nvme_io": false, 00:15:17.254 "nvme_io_md": false, 00:15:17.254 "write_zeroes": true, 00:15:17.254 "zcopy": true, 00:15:17.254 "get_zone_info": false, 00:15:17.254 "zone_management": false, 00:15:17.254 "zone_append": false, 00:15:17.254 "compare": false, 00:15:17.254 "compare_and_write": false, 00:15:17.254 "abort": true, 00:15:17.254 "seek_hole": false, 00:15:17.254 "seek_data": false, 00:15:17.254 "copy": true, 00:15:17.254 "nvme_iov_md": false 00:15:17.254 }, 00:15:17.254 "memory_domains": [ 00:15:17.254 { 00:15:17.254 "dma_device_id": "system", 00:15:17.254 "dma_device_type": 1 00:15:17.254 }, 00:15:17.254 { 00:15:17.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.254 "dma_device_type": 2 
00:15:17.254 } 00:15:17.254 ], 00:15:17.254 "driver_specific": {} 00:15:17.254 } 00:15:17.254 ] 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.254 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.512 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.512 "name": "Existed_Raid", 00:15:17.512 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:17.512 "strip_size_kb": 64, 00:15:17.512 "state": "configuring", 00:15:17.512 "raid_level": "raid0", 00:15:17.512 "superblock": true, 00:15:17.512 "num_base_bdevs": 4, 00:15:17.512 "num_base_bdevs_discovered": 3, 00:15:17.512 "num_base_bdevs_operational": 4, 00:15:17.512 "base_bdevs_list": [ 00:15:17.512 { 00:15:17.512 "name": "BaseBdev1", 00:15:17.512 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:17.512 "is_configured": true, 00:15:17.512 "data_offset": 2048, 00:15:17.512 "data_size": 63488 00:15:17.512 }, 00:15:17.512 { 00:15:17.512 "name": "BaseBdev2", 00:15:17.512 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:17.512 "is_configured": true, 00:15:17.512 "data_offset": 2048, 00:15:17.512 "data_size": 63488 00:15:17.512 }, 00:15:17.512 { 00:15:17.512 "name": "BaseBdev3", 00:15:17.512 "uuid": "fc63f8fa-8533-401d-a8aa-f4cc352ac364", 00:15:17.512 "is_configured": true, 00:15:17.512 "data_offset": 2048, 00:15:17.512 "data_size": 63488 00:15:17.512 }, 00:15:17.512 { 00:15:17.512 "name": "BaseBdev4", 00:15:17.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.512 "is_configured": false, 00:15:17.512 "data_offset": 0, 00:15:17.512 "data_size": 0 00:15:17.512 } 00:15:17.512 ] 00:15:17.512 }' 00:15:17.512 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.512 13:38:04 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:15:17.770 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:18.027 [2024-07-15 13:38:05.531565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:18.027 [2024-07-15 13:38:05.531715] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228a840 00:15:18.027 [2024-07-15 13:38:05.531725] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:18.027 [2024-07-15 13:38:05.531848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x228a480 00:15:18.027 [2024-07-15 13:38:05.531935] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228a840 00:15:18.027 [2024-07-15 13:38:05.531942] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x228a840 00:15:18.027 [2024-07-15 13:38:05.532015] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:18.027 BaseBdev4 00:15:18.027 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:18.027 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:18.027 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:18.027 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:18.027 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:18.028 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:18.028 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.285 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:18.285 [ 00:15:18.285 { 00:15:18.285 "name": "BaseBdev4", 00:15:18.285 "aliases": [ 00:15:18.285 "defb0952-1431-4ff2-92f8-b752ff9bc9de" 00:15:18.285 ], 00:15:18.285 "product_name": "Malloc disk", 00:15:18.285 "block_size": 512, 00:15:18.285 "num_blocks": 65536, 00:15:18.285 "uuid": "defb0952-1431-4ff2-92f8-b752ff9bc9de", 00:15:18.285 "assigned_rate_limits": { 00:15:18.285 "rw_ios_per_sec": 0, 00:15:18.285 "rw_mbytes_per_sec": 0, 00:15:18.285 "r_mbytes_per_sec": 0, 00:15:18.285 "w_mbytes_per_sec": 0 00:15:18.285 }, 00:15:18.285 "claimed": true, 00:15:18.285 "claim_type": "exclusive_write", 00:15:18.285 "zoned": false, 00:15:18.285 "supported_io_types": { 00:15:18.285 "read": true, 00:15:18.285 "write": true, 00:15:18.285 "unmap": true, 00:15:18.285 "flush": true, 00:15:18.285 "reset": true, 00:15:18.285 "nvme_admin": false, 00:15:18.285 "nvme_io": false, 00:15:18.285 "nvme_io_md": false, 00:15:18.285 "write_zeroes": true, 00:15:18.285 "zcopy": true, 00:15:18.285 "get_zone_info": false, 00:15:18.285 "zone_management": false, 00:15:18.285 "zone_append": false, 00:15:18.285 "compare": false, 00:15:18.285 "compare_and_write": false, 00:15:18.285 "abort": true, 00:15:18.285 "seek_hole": false, 00:15:18.285 "seek_data": false, 00:15:18.285 "copy": 
true, 00:15:18.285 "nvme_iov_md": false 00:15:18.285 }, 00:15:18.285 "memory_domains": [ 00:15:18.285 { 00:15:18.285 "dma_device_id": "system", 00:15:18.285 "dma_device_type": 1 00:15:18.285 }, 00:15:18.285 { 00:15:18.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.285 "dma_device_type": 2 00:15:18.285 } 00:15:18.285 ], 00:15:18.285 "driver_specific": {} 00:15:18.285 } 00:15:18.285 ] 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.286 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.543 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.543 "name": "Existed_Raid", 00:15:18.543 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:18.543 "strip_size_kb": 64, 00:15:18.543 "state": "online", 00:15:18.543 "raid_level": "raid0", 00:15:18.543 "superblock": true, 00:15:18.543 "num_base_bdevs": 4, 00:15:18.543 "num_base_bdevs_discovered": 4, 00:15:18.543 "num_base_bdevs_operational": 4, 00:15:18.543 "base_bdevs_list": [ 00:15:18.543 { 00:15:18.543 "name": "BaseBdev1", 00:15:18.543 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:18.543 "is_configured": true, 00:15:18.543 "data_offset": 2048, 00:15:18.543 "data_size": 63488 00:15:18.543 }, 00:15:18.543 { 00:15:18.543 "name": "BaseBdev2", 00:15:18.543 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:18.543 "is_configured": true, 00:15:18.543 "data_offset": 2048, 00:15:18.543 "data_size": 63488 00:15:18.543 }, 00:15:18.543 { 00:15:18.543 "name": "BaseBdev3", 00:15:18.543 "uuid": "fc63f8fa-8533-401d-a8aa-f4cc352ac364", 00:15:18.543 "is_configured": true, 00:15:18.543 "data_offset": 2048, 00:15:18.543 "data_size": 63488 00:15:18.543 }, 00:15:18.543 { 00:15:18.543 "name": "BaseBdev4", 00:15:18.543 "uuid": "defb0952-1431-4ff2-92f8-b752ff9bc9de", 00:15:18.543 
"is_configured": true, 00:15:18.543 "data_offset": 2048, 00:15:18.543 "data_size": 63488 00:15:18.543 } 00:15:18.543 ] 00:15:18.543 }' 00:15:18.543 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.544 13:38:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:19.110 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:19.110 [2024-07-15 13:38:06.706789] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:19.369 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:19.369 "name": "Existed_Raid", 00:15:19.369 "aliases": [ 00:15:19.369 "433f7a8a-08d8-45b5-8d0c-045f470d6819" 00:15:19.369 ], 00:15:19.369 "product_name": "Raid Volume", 00:15:19.369 "block_size": 512, 00:15:19.369 "num_blocks": 253952, 00:15:19.369 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:19.369 "assigned_rate_limits": { 00:15:19.369 "rw_ios_per_sec": 0, 00:15:19.369 "rw_mbytes_per_sec": 0, 00:15:19.369 "r_mbytes_per_sec": 0, 00:15:19.369 "w_mbytes_per_sec": 0 00:15:19.369 }, 00:15:19.369 "claimed": false, 00:15:19.369 "zoned": false, 00:15:19.369 "supported_io_types": { 00:15:19.369 "read": true, 00:15:19.369 "write": true, 00:15:19.369 "unmap": true, 00:15:19.369 "flush": true, 00:15:19.369 "reset": true, 00:15:19.369 "nvme_admin": false, 00:15:19.369 "nvme_io": false, 00:15:19.369 "nvme_io_md": false, 00:15:19.369 "write_zeroes": true, 00:15:19.369 "zcopy": false, 00:15:19.369 "get_zone_info": false, 00:15:19.369 "zone_management": false, 00:15:19.369 "zone_append": false, 00:15:19.369 "compare": false, 00:15:19.369 "compare_and_write": false, 00:15:19.369 "abort": false, 00:15:19.369 "seek_hole": false, 00:15:19.369 "seek_data": false, 00:15:19.369 "copy": false, 00:15:19.369 "nvme_iov_md": false 00:15:19.369 }, 00:15:19.369 "memory_domains": [ 00:15:19.369 { 00:15:19.369 "dma_device_id": "system", 00:15:19.369 "dma_device_type": 1 00:15:19.369 }, 00:15:19.369 { 00:15:19.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.369 "dma_device_type": 2 00:15:19.369 }, 00:15:19.369 { 00:15:19.369 "dma_device_id": "system", 00:15:19.369 "dma_device_type": 1 00:15:19.369 }, 00:15:19.369 { 00:15:19.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.370 "dma_device_type": 2 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "dma_device_id": "system", 00:15:19.370 "dma_device_type": 1 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.370 "dma_device_type": 2 00:15:19.370 }, 00:15:19.370 { 
00:15:19.370 "dma_device_id": "system", 00:15:19.370 "dma_device_type": 1 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.370 "dma_device_type": 2 00:15:19.370 } 00:15:19.370 ], 00:15:19.370 "driver_specific": { 00:15:19.370 "raid": { 00:15:19.370 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:19.370 "strip_size_kb": 64, 00:15:19.370 "state": "online", 00:15:19.370 "raid_level": "raid0", 00:15:19.370 "superblock": true, 00:15:19.370 "num_base_bdevs": 4, 00:15:19.370 "num_base_bdevs_discovered": 4, 00:15:19.370 "num_base_bdevs_operational": 4, 00:15:19.370 "base_bdevs_list": [ 00:15:19.370 { 00:15:19.370 "name": "BaseBdev1", 00:15:19.370 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:19.370 "is_configured": true, 00:15:19.370 "data_offset": 2048, 00:15:19.370 "data_size": 63488 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "name": "BaseBdev2", 00:15:19.370 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:19.370 "is_configured": true, 00:15:19.370 "data_offset": 2048, 00:15:19.370 "data_size": 63488 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "name": "BaseBdev3", 00:15:19.370 "uuid": "fc63f8fa-8533-401d-a8aa-f4cc352ac364", 00:15:19.370 "is_configured": true, 00:15:19.370 "data_offset": 2048, 00:15:19.370 "data_size": 63488 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "name": "BaseBdev4", 00:15:19.370 "uuid": "defb0952-1431-4ff2-92f8-b752ff9bc9de", 00:15:19.370 "is_configured": true, 00:15:19.370 "data_offset": 2048, 00:15:19.370 "data_size": 63488 00:15:19.370 } 00:15:19.370 ] 00:15:19.370 } 00:15:19.370 } 00:15:19.370 }' 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:19.370 BaseBdev2 00:15:19.370 BaseBdev3 00:15:19.370 BaseBdev4' 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:19.370 "name": "BaseBdev1", 00:15:19.370 "aliases": [ 00:15:19.370 "c00f225a-3fa6-4bb2-84fe-6bd76457651b" 00:15:19.370 ], 00:15:19.370 "product_name": "Malloc disk", 00:15:19.370 "block_size": 512, 00:15:19.370 "num_blocks": 65536, 00:15:19.370 "uuid": "c00f225a-3fa6-4bb2-84fe-6bd76457651b", 00:15:19.370 "assigned_rate_limits": { 00:15:19.370 "rw_ios_per_sec": 0, 00:15:19.370 "rw_mbytes_per_sec": 0, 00:15:19.370 "r_mbytes_per_sec": 0, 00:15:19.370 "w_mbytes_per_sec": 0 00:15:19.370 }, 00:15:19.370 "claimed": true, 00:15:19.370 "claim_type": "exclusive_write", 00:15:19.370 "zoned": false, 00:15:19.370 "supported_io_types": { 00:15:19.370 "read": true, 00:15:19.370 "write": true, 00:15:19.370 "unmap": true, 00:15:19.370 "flush": true, 00:15:19.370 "reset": true, 00:15:19.370 "nvme_admin": false, 00:15:19.370 "nvme_io": false, 00:15:19.370 "nvme_io_md": false, 00:15:19.370 "write_zeroes": true, 00:15:19.370 "zcopy": true, 00:15:19.370 "get_zone_info": false, 00:15:19.370 "zone_management": false, 00:15:19.370 "zone_append": 
false, 00:15:19.370 "compare": false, 00:15:19.370 "compare_and_write": false, 00:15:19.370 "abort": true, 00:15:19.370 "seek_hole": false, 00:15:19.370 "seek_data": false, 00:15:19.370 "copy": true, 00:15:19.370 "nvme_iov_md": false 00:15:19.370 }, 00:15:19.370 "memory_domains": [ 00:15:19.370 { 00:15:19.370 "dma_device_id": "system", 00:15:19.370 "dma_device_type": 1 00:15:19.370 }, 00:15:19.370 { 00:15:19.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.370 "dma_device_type": 2 00:15:19.370 } 00:15:19.370 ], 00:15:19.370 "driver_specific": {} 00:15:19.370 }' 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.370 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.629 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:19.629 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:19.629 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:19.888 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:19.888 "name": "BaseBdev2", 00:15:19.888 "aliases": [ 00:15:19.888 "8d51585a-5b6b-4745-a082-3f06e7a732a7" 00:15:19.888 ], 00:15:19.888 "product_name": "Malloc disk", 00:15:19.888 "block_size": 512, 00:15:19.888 "num_blocks": 65536, 00:15:19.888 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:19.888 "assigned_rate_limits": { 00:15:19.888 "rw_ios_per_sec": 0, 00:15:19.888 "rw_mbytes_per_sec": 0, 00:15:19.888 "r_mbytes_per_sec": 0, 00:15:19.888 "w_mbytes_per_sec": 0 00:15:19.888 }, 00:15:19.888 "claimed": true, 00:15:19.888 "claim_type": "exclusive_write", 00:15:19.888 "zoned": false, 00:15:19.888 "supported_io_types": { 00:15:19.888 "read": true, 00:15:19.888 "write": true, 00:15:19.888 "unmap": true, 00:15:19.888 "flush": true, 00:15:19.888 "reset": true, 00:15:19.888 "nvme_admin": false, 00:15:19.888 "nvme_io": false, 00:15:19.888 "nvme_io_md": false, 00:15:19.888 "write_zeroes": true, 00:15:19.888 "zcopy": true, 00:15:19.888 "get_zone_info": false, 00:15:19.888 "zone_management": false, 00:15:19.888 "zone_append": false, 00:15:19.888 "compare": false, 00:15:19.888 "compare_and_write": false, 00:15:19.888 "abort": true, 00:15:19.888 "seek_hole": 
false, 00:15:19.888 "seek_data": false, 00:15:19.888 "copy": true, 00:15:19.888 "nvme_iov_md": false 00:15:19.888 }, 00:15:19.888 "memory_domains": [ 00:15:19.888 { 00:15:19.888 "dma_device_id": "system", 00:15:19.888 "dma_device_type": 1 00:15:19.888 }, 00:15:19.888 { 00:15:19.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.888 "dma_device_type": 2 00:15:19.888 } 00:15:19.888 ], 00:15:19.888 "driver_specific": {} 00:15:19.888 }' 00:15:19.888 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.888 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.889 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:19.889 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.889 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.889 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:19.889 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:20.147 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:20.407 "name": "BaseBdev3", 00:15:20.407 "aliases": [ 00:15:20.407 "fc63f8fa-8533-401d-a8aa-f4cc352ac364" 00:15:20.407 ], 00:15:20.407 "product_name": "Malloc disk", 00:15:20.407 "block_size": 512, 00:15:20.407 "num_blocks": 65536, 00:15:20.407 "uuid": "fc63f8fa-8533-401d-a8aa-f4cc352ac364", 00:15:20.407 "assigned_rate_limits": { 00:15:20.407 "rw_ios_per_sec": 0, 00:15:20.407 "rw_mbytes_per_sec": 0, 00:15:20.407 "r_mbytes_per_sec": 0, 00:15:20.407 "w_mbytes_per_sec": 0 00:15:20.407 }, 00:15:20.407 "claimed": true, 00:15:20.407 "claim_type": "exclusive_write", 00:15:20.407 "zoned": false, 00:15:20.407 "supported_io_types": { 00:15:20.407 "read": true, 00:15:20.407 "write": true, 00:15:20.407 "unmap": true, 00:15:20.407 "flush": true, 00:15:20.407 "reset": true, 00:15:20.407 "nvme_admin": false, 00:15:20.407 "nvme_io": false, 00:15:20.407 "nvme_io_md": false, 00:15:20.407 "write_zeroes": true, 00:15:20.407 "zcopy": true, 00:15:20.407 "get_zone_info": false, 00:15:20.407 "zone_management": false, 00:15:20.407 "zone_append": false, 00:15:20.407 "compare": false, 00:15:20.407 "compare_and_write": false, 00:15:20.407 "abort": true, 00:15:20.407 "seek_hole": false, 00:15:20.407 "seek_data": false, 00:15:20.407 "copy": true, 00:15:20.407 "nvme_iov_md": false 00:15:20.407 }, 00:15:20.407 
"memory_domains": [ 00:15:20.407 { 00:15:20.407 "dma_device_id": "system", 00:15:20.407 "dma_device_type": 1 00:15:20.407 }, 00:15:20.407 { 00:15:20.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.407 "dma_device_type": 2 00:15:20.407 } 00:15:20.407 ], 00:15:20.407 "driver_specific": {} 00:15:20.407 }' 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.407 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:20.666 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:20.666 "name": "BaseBdev4", 00:15:20.666 "aliases": [ 00:15:20.666 "defb0952-1431-4ff2-92f8-b752ff9bc9de" 00:15:20.666 ], 00:15:20.666 "product_name": "Malloc disk", 00:15:20.666 "block_size": 512, 00:15:20.666 "num_blocks": 65536, 00:15:20.666 "uuid": "defb0952-1431-4ff2-92f8-b752ff9bc9de", 00:15:20.666 "assigned_rate_limits": { 00:15:20.666 "rw_ios_per_sec": 0, 00:15:20.666 "rw_mbytes_per_sec": 0, 00:15:20.666 "r_mbytes_per_sec": 0, 00:15:20.666 "w_mbytes_per_sec": 0 00:15:20.666 }, 00:15:20.666 "claimed": true, 00:15:20.666 "claim_type": "exclusive_write", 00:15:20.666 "zoned": false, 00:15:20.666 "supported_io_types": { 00:15:20.666 "read": true, 00:15:20.666 "write": true, 00:15:20.666 "unmap": true, 00:15:20.666 "flush": true, 00:15:20.666 "reset": true, 00:15:20.666 "nvme_admin": false, 00:15:20.666 "nvme_io": false, 00:15:20.666 "nvme_io_md": false, 00:15:20.666 "write_zeroes": true, 00:15:20.666 "zcopy": true, 00:15:20.666 "get_zone_info": false, 00:15:20.666 "zone_management": false, 00:15:20.666 "zone_append": false, 00:15:20.666 "compare": false, 00:15:20.666 "compare_and_write": false, 00:15:20.666 "abort": true, 00:15:20.666 "seek_hole": false, 00:15:20.666 "seek_data": false, 00:15:20.666 "copy": true, 00:15:20.666 "nvme_iov_md": false 00:15:20.666 }, 00:15:20.667 "memory_domains": [ 00:15:20.667 { 00:15:20.667 "dma_device_id": "system", 00:15:20.667 "dma_device_type": 1 00:15:20.667 }, 
00:15:20.667 { 00:15:20.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.667 "dma_device_type": 2 00:15:20.667 } 00:15:20.667 ], 00:15:20.667 "driver_specific": {} 00:15:20.667 }' 00:15:20.667 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.926 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:21.185 [2024-07-15 13:38:08.727853] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:21.185 [2024-07-15 13:38:08.727876] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:21.185 [2024-07-15 13:38:08.727912] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.185 13:38:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.185 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.444 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.444 "name": "Existed_Raid", 00:15:21.444 "uuid": "433f7a8a-08d8-45b5-8d0c-045f470d6819", 00:15:21.444 "strip_size_kb": 64, 00:15:21.444 "state": "offline", 00:15:21.444 "raid_level": "raid0", 00:15:21.444 "superblock": true, 00:15:21.444 "num_base_bdevs": 4, 00:15:21.444 "num_base_bdevs_discovered": 3, 00:15:21.444 "num_base_bdevs_operational": 3, 00:15:21.444 "base_bdevs_list": [ 00:15:21.444 { 00:15:21.444 "name": null, 00:15:21.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.444 "is_configured": false, 00:15:21.444 "data_offset": 2048, 00:15:21.444 "data_size": 63488 00:15:21.444 }, 00:15:21.444 { 00:15:21.444 "name": "BaseBdev2", 00:15:21.444 "uuid": "8d51585a-5b6b-4745-a082-3f06e7a732a7", 00:15:21.444 "is_configured": true, 00:15:21.444 "data_offset": 2048, 00:15:21.444 "data_size": 63488 00:15:21.444 }, 00:15:21.444 { 00:15:21.444 "name": "BaseBdev3", 00:15:21.444 "uuid": "fc63f8fa-8533-401d-a8aa-f4cc352ac364", 00:15:21.444 "is_configured": true, 00:15:21.444 "data_offset": 2048, 00:15:21.444 "data_size": 63488 00:15:21.444 }, 00:15:21.444 { 00:15:21.444 "name": "BaseBdev4", 00:15:21.444 "uuid": "defb0952-1431-4ff2-92f8-b752ff9bc9de", 00:15:21.444 "is_configured": true, 00:15:21.444 "data_offset": 2048, 00:15:21.444 "data_size": 63488 00:15:21.444 } 00:15:21.444 ] 00:15:21.444 }' 00:15:21.444 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.444 13:38:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:22.012 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:22.271 [2024-07-15 13:38:09.743241] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:22.271 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:22.271 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:22.271 13:38:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.271 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:22.530 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:22.530 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:22.530 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:22.530 [2024-07-15 13:38:10.094370] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:22.530 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:22.530 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:22.530 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.530 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:22.789 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:22.789 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:22.789 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:23.048 [2024-07-15 13:38:10.445296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:23.048 [2024-07-15 13:38:10.445327] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228a840 name Existed_Raid, state offline 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:23.048 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:23.330 BaseBdev2 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:23.330 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:23.590 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:23.590 [ 00:15:23.590 { 00:15:23.590 "name": "BaseBdev2", 00:15:23.590 "aliases": [ 00:15:23.590 "5f5bcc72-9022-4975-8f87-3a156c37914f" 00:15:23.590 ], 00:15:23.590 "product_name": "Malloc disk", 00:15:23.590 "block_size": 512, 00:15:23.590 "num_blocks": 65536, 00:15:23.590 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:23.590 "assigned_rate_limits": { 00:15:23.590 "rw_ios_per_sec": 0, 00:15:23.590 "rw_mbytes_per_sec": 0, 00:15:23.590 "r_mbytes_per_sec": 0, 00:15:23.590 "w_mbytes_per_sec": 0 00:15:23.590 }, 00:15:23.590 "claimed": false, 00:15:23.590 "zoned": false, 00:15:23.590 "supported_io_types": { 00:15:23.590 "read": true, 00:15:23.590 "write": true, 00:15:23.590 "unmap": true, 00:15:23.590 "flush": true, 00:15:23.590 "reset": true, 00:15:23.590 "nvme_admin": false, 00:15:23.590 "nvme_io": false, 00:15:23.590 "nvme_io_md": false, 00:15:23.590 "write_zeroes": true, 00:15:23.590 "zcopy": true, 00:15:23.590 "get_zone_info": false, 00:15:23.590 "zone_management": false, 00:15:23.590 "zone_append": false, 00:15:23.590 "compare": false, 00:15:23.590 "compare_and_write": false, 00:15:23.590 "abort": true, 00:15:23.590 "seek_hole": false, 00:15:23.590 "seek_data": false, 00:15:23.590 "copy": true, 00:15:23.590 "nvme_iov_md": false 00:15:23.590 }, 00:15:23.590 "memory_domains": [ 00:15:23.590 { 00:15:23.590 "dma_device_id": "system", 00:15:23.590 "dma_device_type": 1 00:15:23.590 }, 00:15:23.590 { 00:15:23.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.590 "dma_device_type": 2 00:15:23.590 } 00:15:23.590 ], 00:15:23.590 "driver_specific": {} 00:15:23.590 } 00:15:23.590 ] 00:15:23.590 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:23.590 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:23.590 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:23.590 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:23.851 BaseBdev3 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:23.851 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.111 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:24.111 [ 00:15:24.111 { 00:15:24.111 "name": "BaseBdev3", 00:15:24.111 "aliases": [ 00:15:24.111 "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7" 00:15:24.111 ], 00:15:24.111 "product_name": "Malloc disk", 00:15:24.111 "block_size": 512, 00:15:24.111 "num_blocks": 65536, 00:15:24.111 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:24.111 "assigned_rate_limits": { 00:15:24.111 "rw_ios_per_sec": 0, 00:15:24.111 "rw_mbytes_per_sec": 0, 00:15:24.111 "r_mbytes_per_sec": 0, 00:15:24.111 "w_mbytes_per_sec": 0 00:15:24.111 }, 00:15:24.111 "claimed": false, 00:15:24.111 "zoned": false, 00:15:24.111 "supported_io_types": { 00:15:24.111 "read": true, 00:15:24.111 "write": true, 00:15:24.111 "unmap": true, 00:15:24.111 "flush": true, 00:15:24.111 "reset": true, 00:15:24.111 "nvme_admin": false, 00:15:24.111 "nvme_io": false, 00:15:24.111 "nvme_io_md": false, 00:15:24.111 "write_zeroes": true, 00:15:24.111 "zcopy": true, 00:15:24.111 "get_zone_info": false, 00:15:24.111 "zone_management": false, 00:15:24.111 "zone_append": false, 00:15:24.111 "compare": false, 00:15:24.111 "compare_and_write": false, 00:15:24.111 "abort": true, 00:15:24.111 "seek_hole": false, 00:15:24.111 "seek_data": false, 00:15:24.111 "copy": true, 00:15:24.111 "nvme_iov_md": false 00:15:24.111 }, 00:15:24.111 "memory_domains": [ 00:15:24.111 { 00:15:24.111 "dma_device_id": "system", 00:15:24.111 "dma_device_type": 1 00:15:24.111 }, 00:15:24.111 { 00:15:24.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.111 "dma_device_type": 2 00:15:24.111 } 00:15:24.111 ], 00:15:24.111 "driver_specific": {} 00:15:24.111 } 00:15:24.111 ] 00:15:24.111 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:24.111 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:24.111 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:24.111 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:24.369 BaseBdev4 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.369 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:24.627 [ 00:15:24.627 { 00:15:24.627 "name": "BaseBdev4", 00:15:24.627 "aliases": [ 00:15:24.627 "d990fe73-b737-48d6-809e-24f5f6915c04" 00:15:24.627 ], 00:15:24.627 "product_name": "Malloc disk", 00:15:24.627 "block_size": 512, 00:15:24.627 "num_blocks": 65536, 00:15:24.627 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:24.627 "assigned_rate_limits": { 00:15:24.627 "rw_ios_per_sec": 0, 00:15:24.627 "rw_mbytes_per_sec": 0, 00:15:24.627 "r_mbytes_per_sec": 0, 00:15:24.627 "w_mbytes_per_sec": 0 00:15:24.627 }, 00:15:24.627 "claimed": false, 00:15:24.627 "zoned": false, 00:15:24.627 "supported_io_types": { 00:15:24.627 "read": true, 00:15:24.627 "write": true, 00:15:24.627 "unmap": true, 00:15:24.627 "flush": true, 00:15:24.627 "reset": true, 00:15:24.627 "nvme_admin": false, 00:15:24.627 "nvme_io": false, 00:15:24.627 "nvme_io_md": false, 00:15:24.627 "write_zeroes": true, 00:15:24.627 "zcopy": true, 00:15:24.627 "get_zone_info": false, 00:15:24.627 "zone_management": false, 00:15:24.627 "zone_append": false, 00:15:24.627 "compare": false, 00:15:24.627 "compare_and_write": false, 00:15:24.627 "abort": true, 00:15:24.627 "seek_hole": false, 00:15:24.627 "seek_data": false, 00:15:24.627 "copy": true, 00:15:24.627 "nvme_iov_md": false 00:15:24.627 }, 00:15:24.627 "memory_domains": [ 00:15:24.627 { 00:15:24.627 "dma_device_id": "system", 00:15:24.627 "dma_device_type": 1 00:15:24.627 }, 00:15:24.627 { 00:15:24.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.627 "dma_device_type": 2 00:15:24.627 } 00:15:24.627 ], 00:15:24.627 "driver_specific": {} 00:15:24.627 } 00:15:24.627 ] 00:15:24.627 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:24.627 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:24.627 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:24.627 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:24.884 [2024-07-15 13:38:12.311399] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:24.884 [2024-07-15 13:38:12.311432] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:24.884 [2024-07-15 13:38:12.311445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:24.884 [2024-07-15 13:38:12.312464] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:24.884 [2024-07-15 13:38:12.312495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.884 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.142 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.142 "name": "Existed_Raid", 00:15:25.142 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:25.142 "strip_size_kb": 64, 00:15:25.142 "state": "configuring", 00:15:25.142 "raid_level": "raid0", 00:15:25.142 "superblock": true, 00:15:25.142 "num_base_bdevs": 4, 00:15:25.142 "num_base_bdevs_discovered": 3, 00:15:25.142 "num_base_bdevs_operational": 4, 00:15:25.142 "base_bdevs_list": [ 00:15:25.142 { 00:15:25.142 "name": "BaseBdev1", 00:15:25.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.142 "is_configured": false, 00:15:25.142 "data_offset": 0, 00:15:25.142 "data_size": 0 00:15:25.142 }, 00:15:25.142 { 00:15:25.142 "name": "BaseBdev2", 00:15:25.142 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:25.142 "is_configured": true, 00:15:25.142 "data_offset": 2048, 00:15:25.142 "data_size": 63488 00:15:25.142 }, 00:15:25.142 { 00:15:25.142 "name": "BaseBdev3", 00:15:25.142 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:25.142 "is_configured": true, 00:15:25.142 "data_offset": 2048, 00:15:25.142 "data_size": 63488 00:15:25.142 }, 00:15:25.142 { 00:15:25.142 "name": "BaseBdev4", 00:15:25.142 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:25.142 "is_configured": true, 00:15:25.142 "data_offset": 2048, 00:15:25.142 "data_size": 63488 00:15:25.142 } 00:15:25.142 ] 00:15:25.142 }' 00:15:25.142 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.142 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.400 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:25.658 [2024-07-15 13:38:13.133491] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.658 13:38:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.658 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.916 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.916 "name": "Existed_Raid", 00:15:25.916 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:25.916 "strip_size_kb": 64, 00:15:25.916 "state": "configuring", 00:15:25.916 "raid_level": "raid0", 00:15:25.916 "superblock": true, 00:15:25.916 "num_base_bdevs": 4, 00:15:25.916 "num_base_bdevs_discovered": 2, 00:15:25.916 "num_base_bdevs_operational": 4, 00:15:25.916 "base_bdevs_list": [ 00:15:25.916 { 00:15:25.916 "name": "BaseBdev1", 00:15:25.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.916 "is_configured": false, 00:15:25.916 "data_offset": 0, 00:15:25.916 "data_size": 0 00:15:25.916 }, 00:15:25.916 { 00:15:25.916 "name": null, 00:15:25.916 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:25.916 "is_configured": false, 00:15:25.916 "data_offset": 2048, 00:15:25.916 "data_size": 63488 00:15:25.916 }, 00:15:25.916 { 00:15:25.916 "name": "BaseBdev3", 00:15:25.916 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:25.916 "is_configured": true, 00:15:25.916 "data_offset": 2048, 00:15:25.916 "data_size": 63488 00:15:25.916 }, 00:15:25.916 { 00:15:25.916 "name": "BaseBdev4", 00:15:25.916 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:25.916 "is_configured": true, 00:15:25.916 "data_offset": 2048, 00:15:25.916 "data_size": 63488 00:15:25.916 } 00:15:25.916 ] 00:15:25.916 }' 00:15:25.916 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.916 13:38:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:26.175 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.175 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:26.433 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:26.434 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
00:15:26.692 [2024-07-15 13:38:14.116009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:26.692 BaseBdev1 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:26.692 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:26.951 [ 00:15:26.951 { 00:15:26.951 "name": "BaseBdev1", 00:15:26.951 "aliases": [ 00:15:26.951 "efacadbd-bb2d-4626-8825-b30ef7c056d3" 00:15:26.951 ], 00:15:26.951 "product_name": "Malloc disk", 00:15:26.951 "block_size": 512, 00:15:26.951 "num_blocks": 65536, 00:15:26.951 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:26.951 "assigned_rate_limits": { 00:15:26.951 "rw_ios_per_sec": 0, 00:15:26.951 "rw_mbytes_per_sec": 0, 00:15:26.951 "r_mbytes_per_sec": 0, 00:15:26.952 "w_mbytes_per_sec": 0 00:15:26.952 }, 00:15:26.952 "claimed": true, 00:15:26.952 "claim_type": "exclusive_write", 00:15:26.952 "zoned": false, 00:15:26.952 "supported_io_types": { 00:15:26.952 "read": true, 00:15:26.952 "write": true, 00:15:26.952 "unmap": true, 00:15:26.952 "flush": true, 00:15:26.952 "reset": true, 00:15:26.952 "nvme_admin": false, 00:15:26.952 "nvme_io": false, 00:15:26.952 "nvme_io_md": false, 00:15:26.952 "write_zeroes": true, 00:15:26.952 "zcopy": true, 00:15:26.952 "get_zone_info": false, 00:15:26.952 "zone_management": false, 00:15:26.952 "zone_append": false, 00:15:26.952 "compare": false, 00:15:26.952 "compare_and_write": false, 00:15:26.952 "abort": true, 00:15:26.952 "seek_hole": false, 00:15:26.952 "seek_data": false, 00:15:26.952 "copy": true, 00:15:26.952 "nvme_iov_md": false 00:15:26.952 }, 00:15:26.952 "memory_domains": [ 00:15:26.952 { 00:15:26.952 "dma_device_id": "system", 00:15:26.952 "dma_device_type": 1 00:15:26.952 }, 00:15:26.952 { 00:15:26.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.952 "dma_device_type": 2 00:15:26.952 } 00:15:26.952 ], 00:15:26.952 "driver_specific": {} 00:15:26.952 } 00:15:26.952 ] 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
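The waitforbdev helper traced a few lines above follows the usual autotest pattern: flush outstanding examine callbacks, then query the new bdev with a timeout so the test blocks until it registers. A minimal sketch using only the calls visible in the trace; the real helper in autotest_common.sh wraps this in retry handling that the trace only hints at.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Let pending examine callbacks finish, then wait (up to 2000 ms) for the bdev.
  $rpc bdev_wait_for_examine
  $rpc bdev_get_bdevs -b BaseBdev1 -t 2000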
00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.952 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.211 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.211 "name": "Existed_Raid", 00:15:27.211 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:27.211 "strip_size_kb": 64, 00:15:27.211 "state": "configuring", 00:15:27.211 "raid_level": "raid0", 00:15:27.211 "superblock": true, 00:15:27.211 "num_base_bdevs": 4, 00:15:27.211 "num_base_bdevs_discovered": 3, 00:15:27.211 "num_base_bdevs_operational": 4, 00:15:27.211 "base_bdevs_list": [ 00:15:27.211 { 00:15:27.211 "name": "BaseBdev1", 00:15:27.211 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:27.211 "is_configured": true, 00:15:27.211 "data_offset": 2048, 00:15:27.211 "data_size": 63488 00:15:27.211 }, 00:15:27.211 { 00:15:27.211 "name": null, 00:15:27.211 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:27.211 "is_configured": false, 00:15:27.211 "data_offset": 2048, 00:15:27.211 "data_size": 63488 00:15:27.211 }, 00:15:27.211 { 00:15:27.211 "name": "BaseBdev3", 00:15:27.211 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:27.211 "is_configured": true, 00:15:27.211 "data_offset": 2048, 00:15:27.211 "data_size": 63488 00:15:27.211 }, 00:15:27.211 { 00:15:27.211 "name": "BaseBdev4", 00:15:27.211 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:27.211 "is_configured": true, 00:15:27.211 "data_offset": 2048, 00:15:27.211 "data_size": 63488 00:15:27.211 } 00:15:27.211 ] 00:15:27.211 }' 00:15:27.211 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.211 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.778 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.778 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:27.778 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:27.778 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:28.037 [2024-07-15 13:38:15.463491] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.037 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.296 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.296 "name": "Existed_Raid", 00:15:28.296 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:28.296 "strip_size_kb": 64, 00:15:28.296 "state": "configuring", 00:15:28.296 "raid_level": "raid0", 00:15:28.296 "superblock": true, 00:15:28.296 "num_base_bdevs": 4, 00:15:28.296 "num_base_bdevs_discovered": 2, 00:15:28.296 "num_base_bdevs_operational": 4, 00:15:28.296 "base_bdevs_list": [ 00:15:28.296 { 00:15:28.296 "name": "BaseBdev1", 00:15:28.296 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:28.296 "is_configured": true, 00:15:28.296 "data_offset": 2048, 00:15:28.296 "data_size": 63488 00:15:28.296 }, 00:15:28.296 { 00:15:28.296 "name": null, 00:15:28.296 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:28.296 "is_configured": false, 00:15:28.296 "data_offset": 2048, 00:15:28.296 "data_size": 63488 00:15:28.296 }, 00:15:28.296 { 00:15:28.296 "name": null, 00:15:28.296 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:28.296 "is_configured": false, 00:15:28.296 "data_offset": 2048, 00:15:28.296 "data_size": 63488 00:15:28.296 }, 00:15:28.296 { 00:15:28.296 "name": "BaseBdev4", 00:15:28.296 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:28.296 "is_configured": true, 00:15:28.296 "data_offset": 2048, 00:15:28.296 "data_size": 63488 00:15:28.296 } 00:15:28.296 ] 00:15:28.296 }' 00:15:28.296 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.296 13:38:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.554 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.554 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:28.813 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:28.813 
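Every hot-remove in this test is verified the same way, and the member is later re-attached with the mirror-image RPC, as the trace below shows. Condensed into a sketch that uses only the RPCs and jq paths appearing in the trace (the "expect" comments are inferred from the surrounding checks):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Detach a member: the array stays "configuring", num_base_bdevs_discovered drops,
  # and the vacated slot reports is_configured == false.
  $rpc bdev_raid_remove_base_bdev BaseBdev3
  $rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect: false
  # Re-attach it: the same slot flips back to is_configured == true.
  $rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  $rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect: true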
13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:29.072 [2024-07-15 13:38:16.470124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.072 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.072 "name": "Existed_Raid", 00:15:29.072 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:29.072 "strip_size_kb": 64, 00:15:29.072 "state": "configuring", 00:15:29.072 "raid_level": "raid0", 00:15:29.072 "superblock": true, 00:15:29.072 "num_base_bdevs": 4, 00:15:29.072 "num_base_bdevs_discovered": 3, 00:15:29.072 "num_base_bdevs_operational": 4, 00:15:29.072 "base_bdevs_list": [ 00:15:29.072 { 00:15:29.072 "name": "BaseBdev1", 00:15:29.072 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:29.072 "is_configured": true, 00:15:29.072 "data_offset": 2048, 00:15:29.072 "data_size": 63488 00:15:29.072 }, 00:15:29.072 { 00:15:29.072 "name": null, 00:15:29.072 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:29.072 "is_configured": false, 00:15:29.072 "data_offset": 2048, 00:15:29.072 "data_size": 63488 00:15:29.072 }, 00:15:29.073 { 00:15:29.073 "name": "BaseBdev3", 00:15:29.073 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:29.073 "is_configured": true, 00:15:29.073 "data_offset": 2048, 00:15:29.073 "data_size": 63488 00:15:29.073 }, 00:15:29.073 { 00:15:29.073 "name": "BaseBdev4", 00:15:29.073 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:29.073 "is_configured": true, 00:15:29.073 "data_offset": 2048, 00:15:29.073 "data_size": 63488 00:15:29.073 } 00:15:29.073 ] 00:15:29.073 }' 00:15:29.073 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.073 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:29.662 13:38:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:29.662 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:29.954 [2024-07-15 13:38:17.496887] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.954 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.213 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.213 "name": "Existed_Raid", 00:15:30.213 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:30.213 "strip_size_kb": 64, 00:15:30.213 "state": "configuring", 00:15:30.213 "raid_level": "raid0", 00:15:30.213 "superblock": true, 00:15:30.213 "num_base_bdevs": 4, 00:15:30.213 "num_base_bdevs_discovered": 2, 00:15:30.213 "num_base_bdevs_operational": 4, 00:15:30.213 "base_bdevs_list": [ 00:15:30.213 { 00:15:30.213 "name": null, 00:15:30.213 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:30.213 "is_configured": false, 00:15:30.213 "data_offset": 2048, 00:15:30.213 "data_size": 63488 00:15:30.213 }, 00:15:30.213 { 00:15:30.213 "name": null, 00:15:30.213 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:30.213 "is_configured": false, 00:15:30.213 "data_offset": 2048, 00:15:30.213 "data_size": 63488 00:15:30.213 }, 00:15:30.213 { 00:15:30.213 "name": "BaseBdev3", 00:15:30.213 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:30.213 "is_configured": true, 00:15:30.213 "data_offset": 2048, 00:15:30.213 "data_size": 63488 00:15:30.213 }, 00:15:30.213 { 00:15:30.213 "name": "BaseBdev4", 00:15:30.213 "uuid": 
"d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:30.213 "is_configured": true, 00:15:30.213 "data_offset": 2048, 00:15:30.213 "data_size": 63488 00:15:30.213 } 00:15:30.213 ] 00:15:30.213 }' 00:15:30.213 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.213 13:38:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:30.780 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.780 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:30.780 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:30.780 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:31.040 [2024-07-15 13:38:18.517585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.040 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.299 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.299 "name": "Existed_Raid", 00:15:31.299 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:31.299 "strip_size_kb": 64, 00:15:31.299 "state": "configuring", 00:15:31.299 "raid_level": "raid0", 00:15:31.299 "superblock": true, 00:15:31.299 "num_base_bdevs": 4, 00:15:31.299 "num_base_bdevs_discovered": 3, 00:15:31.299 "num_base_bdevs_operational": 4, 00:15:31.299 "base_bdevs_list": [ 00:15:31.299 { 00:15:31.299 "name": null, 00:15:31.299 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:31.299 "is_configured": false, 00:15:31.299 "data_offset": 2048, 00:15:31.299 "data_size": 63488 00:15:31.299 }, 00:15:31.299 { 00:15:31.299 "name": "BaseBdev2", 00:15:31.299 "uuid": 
"5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:31.299 "is_configured": true, 00:15:31.299 "data_offset": 2048, 00:15:31.299 "data_size": 63488 00:15:31.299 }, 00:15:31.299 { 00:15:31.299 "name": "BaseBdev3", 00:15:31.299 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:31.299 "is_configured": true, 00:15:31.299 "data_offset": 2048, 00:15:31.299 "data_size": 63488 00:15:31.299 }, 00:15:31.299 { 00:15:31.299 "name": "BaseBdev4", 00:15:31.299 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:31.299 "is_configured": true, 00:15:31.299 "data_offset": 2048, 00:15:31.299 "data_size": 63488 00:15:31.299 } 00:15:31.299 ] 00:15:31.299 }' 00:15:31.299 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.299 13:38:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:31.867 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:31.867 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.867 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:31.867 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.867 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u efacadbd-bb2d-4626-8825-b30ef7c056d3 00:15:32.126 [2024-07-15 13:38:19.704632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:32.126 [2024-07-15 13:38:19.704750] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228e4a0 00:15:32.126 [2024-07-15 13:38:19.704758] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:32.126 [2024-07-15 13:38:19.704875] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2281fc0 00:15:32.126 [2024-07-15 13:38:19.704961] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228e4a0 00:15:32.126 [2024-07-15 13:38:19.704967] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x228e4a0 00:15:32.126 [2024-07-15 13:38:19.705052] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:32.126 NewBaseBdev 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:32.126 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:32.126 13:38:19 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:32.385 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:32.644 [ 00:15:32.644 { 00:15:32.644 "name": "NewBaseBdev", 00:15:32.644 "aliases": [ 00:15:32.644 "efacadbd-bb2d-4626-8825-b30ef7c056d3" 00:15:32.644 ], 00:15:32.644 "product_name": "Malloc disk", 00:15:32.644 "block_size": 512, 00:15:32.644 "num_blocks": 65536, 00:15:32.644 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:32.644 "assigned_rate_limits": { 00:15:32.644 "rw_ios_per_sec": 0, 00:15:32.644 "rw_mbytes_per_sec": 0, 00:15:32.644 "r_mbytes_per_sec": 0, 00:15:32.644 "w_mbytes_per_sec": 0 00:15:32.644 }, 00:15:32.644 "claimed": true, 00:15:32.644 "claim_type": "exclusive_write", 00:15:32.644 "zoned": false, 00:15:32.644 "supported_io_types": { 00:15:32.644 "read": true, 00:15:32.644 "write": true, 00:15:32.644 "unmap": true, 00:15:32.644 "flush": true, 00:15:32.644 "reset": true, 00:15:32.644 "nvme_admin": false, 00:15:32.644 "nvme_io": false, 00:15:32.644 "nvme_io_md": false, 00:15:32.644 "write_zeroes": true, 00:15:32.644 "zcopy": true, 00:15:32.644 "get_zone_info": false, 00:15:32.644 "zone_management": false, 00:15:32.644 "zone_append": false, 00:15:32.644 "compare": false, 00:15:32.644 "compare_and_write": false, 00:15:32.644 "abort": true, 00:15:32.644 "seek_hole": false, 00:15:32.644 "seek_data": false, 00:15:32.644 "copy": true, 00:15:32.644 "nvme_iov_md": false 00:15:32.644 }, 00:15:32.644 "memory_domains": [ 00:15:32.644 { 00:15:32.644 "dma_device_id": "system", 00:15:32.644 "dma_device_type": 1 00:15:32.644 }, 00:15:32.644 { 00:15:32.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.644 "dma_device_type": 2 00:15:32.644 } 00:15:32.644 ], 00:15:32.644 "driver_specific": {} 00:15:32.644 } 00:15:32.644 ] 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.644 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.644 "name": "Existed_Raid", 00:15:32.644 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:32.644 "strip_size_kb": 64, 00:15:32.644 "state": "online", 00:15:32.644 "raid_level": "raid0", 00:15:32.644 "superblock": true, 00:15:32.644 "num_base_bdevs": 4, 00:15:32.644 "num_base_bdevs_discovered": 4, 00:15:32.644 "num_base_bdevs_operational": 4, 00:15:32.644 "base_bdevs_list": [ 00:15:32.644 { 00:15:32.644 "name": "NewBaseBdev", 00:15:32.644 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:32.644 "is_configured": true, 00:15:32.644 "data_offset": 2048, 00:15:32.644 "data_size": 63488 00:15:32.644 }, 00:15:32.644 { 00:15:32.644 "name": "BaseBdev2", 00:15:32.644 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:32.644 "is_configured": true, 00:15:32.644 "data_offset": 2048, 00:15:32.644 "data_size": 63488 00:15:32.644 }, 00:15:32.644 { 00:15:32.644 "name": "BaseBdev3", 00:15:32.644 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:32.644 "is_configured": true, 00:15:32.644 "data_offset": 2048, 00:15:32.644 "data_size": 63488 00:15:32.644 }, 00:15:32.645 { 00:15:32.645 "name": "BaseBdev4", 00:15:32.645 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:32.645 "is_configured": true, 00:15:32.645 "data_offset": 2048, 00:15:32.645 "data_size": 63488 00:15:32.645 } 00:15:32.645 ] 00:15:32.645 }' 00:15:32.645 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.645 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:33.212 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:33.471 [2024-07-15 13:38:20.892048] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.471 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:33.471 "name": "Existed_Raid", 00:15:33.471 "aliases": [ 00:15:33.471 "3a0728ec-4c47-41b7-8e91-d5cf160fa258" 00:15:33.471 ], 00:15:33.471 "product_name": "Raid Volume", 00:15:33.471 "block_size": 512, 00:15:33.471 "num_blocks": 253952, 00:15:33.471 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:33.471 "assigned_rate_limits": { 00:15:33.471 "rw_ios_per_sec": 0, 00:15:33.471 "rw_mbytes_per_sec": 0, 00:15:33.471 "r_mbytes_per_sec": 0, 00:15:33.471 "w_mbytes_per_sec": 0 00:15:33.471 }, 00:15:33.471 
"claimed": false, 00:15:33.471 "zoned": false, 00:15:33.471 "supported_io_types": { 00:15:33.471 "read": true, 00:15:33.471 "write": true, 00:15:33.471 "unmap": true, 00:15:33.471 "flush": true, 00:15:33.471 "reset": true, 00:15:33.471 "nvme_admin": false, 00:15:33.471 "nvme_io": false, 00:15:33.471 "nvme_io_md": false, 00:15:33.471 "write_zeroes": true, 00:15:33.471 "zcopy": false, 00:15:33.471 "get_zone_info": false, 00:15:33.471 "zone_management": false, 00:15:33.471 "zone_append": false, 00:15:33.471 "compare": false, 00:15:33.471 "compare_and_write": false, 00:15:33.471 "abort": false, 00:15:33.471 "seek_hole": false, 00:15:33.471 "seek_data": false, 00:15:33.471 "copy": false, 00:15:33.471 "nvme_iov_md": false 00:15:33.471 }, 00:15:33.471 "memory_domains": [ 00:15:33.471 { 00:15:33.471 "dma_device_id": "system", 00:15:33.471 "dma_device_type": 1 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.471 "dma_device_type": 2 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "system", 00:15:33.471 "dma_device_type": 1 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.471 "dma_device_type": 2 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "system", 00:15:33.471 "dma_device_type": 1 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.471 "dma_device_type": 2 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "system", 00:15:33.471 "dma_device_type": 1 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.471 "dma_device_type": 2 00:15:33.471 } 00:15:33.471 ], 00:15:33.471 "driver_specific": { 00:15:33.471 "raid": { 00:15:33.471 "uuid": "3a0728ec-4c47-41b7-8e91-d5cf160fa258", 00:15:33.471 "strip_size_kb": 64, 00:15:33.471 "state": "online", 00:15:33.471 "raid_level": "raid0", 00:15:33.471 "superblock": true, 00:15:33.471 "num_base_bdevs": 4, 00:15:33.471 "num_base_bdevs_discovered": 4, 00:15:33.471 "num_base_bdevs_operational": 4, 00:15:33.471 "base_bdevs_list": [ 00:15:33.471 { 00:15:33.471 "name": "NewBaseBdev", 00:15:33.471 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:33.471 "is_configured": true, 00:15:33.471 "data_offset": 2048, 00:15:33.471 "data_size": 63488 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "name": "BaseBdev2", 00:15:33.471 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:33.471 "is_configured": true, 00:15:33.471 "data_offset": 2048, 00:15:33.471 "data_size": 63488 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "name": "BaseBdev3", 00:15:33.471 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:33.471 "is_configured": true, 00:15:33.471 "data_offset": 2048, 00:15:33.471 "data_size": 63488 00:15:33.471 }, 00:15:33.471 { 00:15:33.471 "name": "BaseBdev4", 00:15:33.471 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:33.471 "is_configured": true, 00:15:33.471 "data_offset": 2048, 00:15:33.471 "data_size": 63488 00:15:33.471 } 00:15:33.471 ] 00:15:33.471 } 00:15:33.471 } 00:15:33.471 }' 00:15:33.471 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:33.471 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:33.471 BaseBdev2 00:15:33.471 BaseBdev3 00:15:33.471 BaseBdev4' 00:15:33.471 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:15:33.471 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:33.471 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.730 "name": "NewBaseBdev", 00:15:33.730 "aliases": [ 00:15:33.730 "efacadbd-bb2d-4626-8825-b30ef7c056d3" 00:15:33.730 ], 00:15:33.730 "product_name": "Malloc disk", 00:15:33.730 "block_size": 512, 00:15:33.730 "num_blocks": 65536, 00:15:33.730 "uuid": "efacadbd-bb2d-4626-8825-b30ef7c056d3", 00:15:33.730 "assigned_rate_limits": { 00:15:33.730 "rw_ios_per_sec": 0, 00:15:33.730 "rw_mbytes_per_sec": 0, 00:15:33.730 "r_mbytes_per_sec": 0, 00:15:33.730 "w_mbytes_per_sec": 0 00:15:33.730 }, 00:15:33.730 "claimed": true, 00:15:33.730 "claim_type": "exclusive_write", 00:15:33.730 "zoned": false, 00:15:33.730 "supported_io_types": { 00:15:33.730 "read": true, 00:15:33.730 "write": true, 00:15:33.730 "unmap": true, 00:15:33.730 "flush": true, 00:15:33.730 "reset": true, 00:15:33.730 "nvme_admin": false, 00:15:33.730 "nvme_io": false, 00:15:33.730 "nvme_io_md": false, 00:15:33.730 "write_zeroes": true, 00:15:33.730 "zcopy": true, 00:15:33.730 "get_zone_info": false, 00:15:33.730 "zone_management": false, 00:15:33.730 "zone_append": false, 00:15:33.730 "compare": false, 00:15:33.730 "compare_and_write": false, 00:15:33.730 "abort": true, 00:15:33.730 "seek_hole": false, 00:15:33.730 "seek_data": false, 00:15:33.730 "copy": true, 00:15:33.730 "nvme_iov_md": false 00:15:33.730 }, 00:15:33.730 "memory_domains": [ 00:15:33.730 { 00:15:33.730 "dma_device_id": "system", 00:15:33.730 "dma_device_type": 1 00:15:33.730 }, 00:15:33.730 { 00:15:33.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.730 "dma_device_type": 2 00:15:33.730 } 00:15:33.730 ], 00:15:33.730 "driver_specific": {} 00:15:33.730 }' 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.730 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.989 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.989 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.989 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.989 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:33.989 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.248 "name": "BaseBdev2", 00:15:34.248 "aliases": [ 00:15:34.248 "5f5bcc72-9022-4975-8f87-3a156c37914f" 00:15:34.248 ], 00:15:34.248 "product_name": "Malloc disk", 00:15:34.248 "block_size": 512, 00:15:34.248 "num_blocks": 65536, 00:15:34.248 "uuid": "5f5bcc72-9022-4975-8f87-3a156c37914f", 00:15:34.248 "assigned_rate_limits": { 00:15:34.248 "rw_ios_per_sec": 0, 00:15:34.248 "rw_mbytes_per_sec": 0, 00:15:34.248 "r_mbytes_per_sec": 0, 00:15:34.248 "w_mbytes_per_sec": 0 00:15:34.248 }, 00:15:34.248 "claimed": true, 00:15:34.248 "claim_type": "exclusive_write", 00:15:34.248 "zoned": false, 00:15:34.248 "supported_io_types": { 00:15:34.248 "read": true, 00:15:34.248 "write": true, 00:15:34.248 "unmap": true, 00:15:34.248 "flush": true, 00:15:34.248 "reset": true, 00:15:34.248 "nvme_admin": false, 00:15:34.248 "nvme_io": false, 00:15:34.248 "nvme_io_md": false, 00:15:34.248 "write_zeroes": true, 00:15:34.248 "zcopy": true, 00:15:34.248 "get_zone_info": false, 00:15:34.248 "zone_management": false, 00:15:34.248 "zone_append": false, 00:15:34.248 "compare": false, 00:15:34.248 "compare_and_write": false, 00:15:34.248 "abort": true, 00:15:34.248 "seek_hole": false, 00:15:34.248 "seek_data": false, 00:15:34.248 "copy": true, 00:15:34.248 "nvme_iov_md": false 00:15:34.248 }, 00:15:34.248 "memory_domains": [ 00:15:34.248 { 00:15:34.248 "dma_device_id": "system", 00:15:34.248 "dma_device_type": 1 00:15:34.248 }, 00:15:34.248 { 00:15:34.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.248 "dma_device_type": 2 00:15:34.248 } 00:15:34.248 ], 00:15:34.248 "driver_specific": {} 00:15:34.248 }' 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.248 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.506 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.506 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.506 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.506 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.506 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.506 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:34.507 
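The block_size/md_size/md_interleave/dif_type checks repeat for every member of the array as the trace continues below. The jq filters and the member list come straight from the trace (base_bdev_names was printed earlier as "NewBaseBdev BaseBdev2 BaseBdev3 BaseBdev4"); the condensed loop body is an assumption.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for name in NewBaseBdev BaseBdev2 BaseBdev3 BaseBdev4; do
      info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
      [[ $(jq .block_size    <<<"$info") == 512  ]]   # every member exposes 512-byte blocks
      [[ $(jq .md_size       <<<"$info") == null ]]   # no separate metadata region
      [[ $(jq .md_interleave <<<"$info") == null ]]
      [[ $(jq .dif_type      <<<"$info") == null ]]
  done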
13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.507 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.507 "name": "BaseBdev3", 00:15:34.507 "aliases": [ 00:15:34.507 "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7" 00:15:34.507 ], 00:15:34.507 "product_name": "Malloc disk", 00:15:34.507 "block_size": 512, 00:15:34.507 "num_blocks": 65536, 00:15:34.507 "uuid": "e5348dd9-0c6f-4c1c-8a67-e061c909c9b7", 00:15:34.507 "assigned_rate_limits": { 00:15:34.507 "rw_ios_per_sec": 0, 00:15:34.507 "rw_mbytes_per_sec": 0, 00:15:34.507 "r_mbytes_per_sec": 0, 00:15:34.507 "w_mbytes_per_sec": 0 00:15:34.507 }, 00:15:34.507 "claimed": true, 00:15:34.507 "claim_type": "exclusive_write", 00:15:34.507 "zoned": false, 00:15:34.507 "supported_io_types": { 00:15:34.507 "read": true, 00:15:34.507 "write": true, 00:15:34.507 "unmap": true, 00:15:34.507 "flush": true, 00:15:34.507 "reset": true, 00:15:34.507 "nvme_admin": false, 00:15:34.507 "nvme_io": false, 00:15:34.507 "nvme_io_md": false, 00:15:34.507 "write_zeroes": true, 00:15:34.507 "zcopy": true, 00:15:34.507 "get_zone_info": false, 00:15:34.507 "zone_management": false, 00:15:34.507 "zone_append": false, 00:15:34.507 "compare": false, 00:15:34.507 "compare_and_write": false, 00:15:34.507 "abort": true, 00:15:34.507 "seek_hole": false, 00:15:34.507 "seek_data": false, 00:15:34.507 "copy": true, 00:15:34.507 "nvme_iov_md": false 00:15:34.507 }, 00:15:34.507 "memory_domains": [ 00:15:34.507 { 00:15:34.507 "dma_device_id": "system", 00:15:34.507 "dma_device_type": 1 00:15:34.507 }, 00:15:34.507 { 00:15:34.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.507 "dma_device_type": 2 00:15:34.507 } 00:15:34.507 ], 00:15:34.507 "driver_specific": {} 00:15:34.507 }' 00:15:34.507 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.765 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.765 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.765 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.765 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.766 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.766 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.766 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.766 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.766 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.766 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.024 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.024 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:35.024 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:35.024 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:35.024 13:38:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:35.024 "name": "BaseBdev4", 00:15:35.024 "aliases": [ 00:15:35.024 "d990fe73-b737-48d6-809e-24f5f6915c04" 00:15:35.024 ], 00:15:35.024 "product_name": "Malloc disk", 00:15:35.024 "block_size": 512, 00:15:35.024 "num_blocks": 65536, 00:15:35.024 "uuid": "d990fe73-b737-48d6-809e-24f5f6915c04", 00:15:35.024 "assigned_rate_limits": { 00:15:35.024 "rw_ios_per_sec": 0, 00:15:35.024 "rw_mbytes_per_sec": 0, 00:15:35.024 "r_mbytes_per_sec": 0, 00:15:35.024 "w_mbytes_per_sec": 0 00:15:35.024 }, 00:15:35.024 "claimed": true, 00:15:35.024 "claim_type": "exclusive_write", 00:15:35.024 "zoned": false, 00:15:35.024 "supported_io_types": { 00:15:35.024 "read": true, 00:15:35.024 "write": true, 00:15:35.024 "unmap": true, 00:15:35.024 "flush": true, 00:15:35.024 "reset": true, 00:15:35.024 "nvme_admin": false, 00:15:35.024 "nvme_io": false, 00:15:35.024 "nvme_io_md": false, 00:15:35.024 "write_zeroes": true, 00:15:35.024 "zcopy": true, 00:15:35.024 "get_zone_info": false, 00:15:35.024 "zone_management": false, 00:15:35.024 "zone_append": false, 00:15:35.024 "compare": false, 00:15:35.024 "compare_and_write": false, 00:15:35.024 "abort": true, 00:15:35.024 "seek_hole": false, 00:15:35.024 "seek_data": false, 00:15:35.024 "copy": true, 00:15:35.024 "nvme_iov_md": false 00:15:35.024 }, 00:15:35.024 "memory_domains": [ 00:15:35.024 { 00:15:35.024 "dma_device_id": "system", 00:15:35.024 "dma_device_type": 1 00:15:35.024 }, 00:15:35.024 { 00:15:35.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.024 "dma_device_type": 2 00:15:35.024 } 00:15:35.024 ], 00:15:35.024 "driver_specific": {} 00:15:35.024 }' 00:15:35.024 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.024 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.282 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:35.541 [2024-07-15 13:38:23.025389] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:35.541 [2024-07-15 13:38:23.025412] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:35.541 [2024-07-15 13:38:23.025454] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:15:35.541 [2024-07-15 13:38:23.025493] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:35.541 [2024-07-15 13:38:23.025502] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228e4a0 name Existed_Raid, state offline 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 25248 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 25248 ']' 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 25248 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 25248 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 25248' 00:15:35.541 killing process with pid 25248 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 25248 00:15:35.541 [2024-07-15 13:38:23.086583] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:35.541 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 25248 00:15:35.541 [2024-07-15 13:38:23.128446] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:35.798 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:35.798 00:15:35.798 real 0m24.597s 00:15:35.798 user 0m44.972s 00:15:35.798 sys 0m4.664s 00:15:35.798 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:35.798 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:35.798 ************************************ 00:15:35.798 END TEST raid_state_function_test_sb 00:15:35.798 ************************************ 00:15:35.798 13:38:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:35.798 13:38:23 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:15:35.798 13:38:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:35.798 13:38:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:35.798 13:38:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:35.798 ************************************ 00:15:35.798 START TEST raid_superblock_test 00:15:35.798 ************************************ 00:15:35.798 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:15:35.798 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:15:35.798 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:15:35.798 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:35.798 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 
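Teardown for raid_state_function_test_sb, traced just above, deletes the RAID bdev over RPC and then stops the app through the killprocess helper. A minimal sketch of that sequence built from the calls visible in the trace; error handling and the sudo special case are omitted.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_delete Existed_Raid                 # raid bdev: online -> offline -> freed
  pid=25248
  kill -0 "$pid"                                     # confirm the app is still running
  [[ $(ps --no-headers -o comm= "$pid") != sudo ]]   # guard seen in the trace: not a sudo wrapper
  kill "$pid"
  wait "$pid"                                        # reap it so the next test starts clean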
00:15:35.798 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=29166 00:15:35.799 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 29166 /var/tmp/spdk-raid.sock 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 29166 ']' 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:36.057 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:36.057 13:38:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.057 [2024-07-15 13:38:23.464751] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
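Once the standalone bdev_svc app is up and listening on /var/tmp/spdk-raid.sock, the trace below builds the base-device stack for raid_superblock_test: one malloc bdev wrapped by one passthru bdev per slot, each passthru pinned to a fixed UUID, and finally the raid0 array created on top. Condensed into the commands that appear below; the loop form mirrors the script's (( i <= num_base_bdevs )) counter, and the flag meanings are inferred from the variable names in the trace.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in 1 2 3 4; do
      $rpc bdev_malloc_create 32 512 -b "malloc$i"          # 32 MiB backing store, 512-byte blocks
      $rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
  done
  # -z 64: strip size (KiB), -r raid0: level, -s: the superblock variant this test exercises.
  $rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s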
00:15:36.057 [2024-07-15 13:38:23.464797] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid29166 ] 00:15:36.057 [2024-07-15 13:38:23.552040] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:36.057 [2024-07-15 13:38:23.642953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:36.315 [2024-07-15 13:38:23.697142] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:36.315 [2024-07-15 13:38:23.697170] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:36.884 malloc1 00:15:36.884 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:37.143 [2024-07-15 13:38:24.620341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:37.143 [2024-07-15 13:38:24.620381] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.143 [2024-07-15 13:38:24.620396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x130c260 00:15:37.143 [2024-07-15 13:38:24.620405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.143 [2024-07-15 13:38:24.621714] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.143 [2024-07-15 13:38:24.621738] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:37.143 pt1 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:37.143 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:37.402 malloc2 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:37.402 [2024-07-15 13:38:24.962260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:37.402 [2024-07-15 13:38:24.962296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.402 [2024-07-15 13:38:24.962308] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b6310 00:15:37.402 [2024-07-15 13:38:24.962332] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.402 [2024-07-15 13:38:24.963521] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.402 [2024-07-15 13:38:24.963543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:37.402 pt2 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:37.402 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:37.661 malloc3 00:15:37.661 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:37.919 [2024-07-15 13:38:25.316097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:37.919 [2024-07-15 13:38:25.316131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.919 [2024-07-15 13:38:25.316159] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b9e70 00:15:37.919 [2024-07-15 13:38:25.316167] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.919 [2024-07-15 13:38:25.317294] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.919 [2024-07-15 13:38:25.317316] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:37.919 pt3 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:15:37.919 malloc4 00:15:37.919 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:38.178 [2024-07-15 13:38:25.672685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:38.178 [2024-07-15 13:38:25.672723] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:38.178 [2024-07-15 13:38:25.672751] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b6d40 00:15:38.178 [2024-07-15 13:38:25.672759] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:38.178 [2024-07-15 13:38:25.673935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:38.178 [2024-07-15 13:38:25.673958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:38.178 pt4 00:15:38.178 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:38.178 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:38.178 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:15:38.436 [2024-07-15 13:38:25.849179] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:38.436 [2024-07-15 13:38:25.850148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:38.436 [2024-07-15 13:38:25.850188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:38.436 [2024-07-15 13:38:25.850216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:38.436 [2024-07-15 13:38:25.850331] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14ba180 00:15:38.436 [2024-07-15 13:38:25.850338] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:38.436 [2024-07-15 13:38:25.850482] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14be580 00:15:38.436 [2024-07-15 13:38:25.850583] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14ba180 00:15:38.436 [2024-07-15 13:38:25.850589] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14ba180 00:15:38.436 [2024-07-15 13:38:25.850657] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:38.436 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:38.436 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:38.436 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:38.436 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.437 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:38.437 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.437 "name": "raid_bdev1", 00:15:38.437 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:38.437 "strip_size_kb": 64, 00:15:38.437 "state": "online", 00:15:38.437 "raid_level": "raid0", 00:15:38.437 "superblock": true, 00:15:38.437 "num_base_bdevs": 4, 00:15:38.437 "num_base_bdevs_discovered": 4, 00:15:38.437 "num_base_bdevs_operational": 4, 00:15:38.437 "base_bdevs_list": [ 00:15:38.437 { 00:15:38.437 "name": "pt1", 00:15:38.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.437 "is_configured": true, 00:15:38.437 "data_offset": 2048, 00:15:38.437 "data_size": 63488 00:15:38.437 }, 00:15:38.437 { 00:15:38.437 "name": "pt2", 00:15:38.437 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.437 "is_configured": true, 00:15:38.437 "data_offset": 2048, 00:15:38.437 "data_size": 63488 00:15:38.437 }, 00:15:38.437 { 00:15:38.437 "name": "pt3", 00:15:38.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.437 "is_configured": true, 00:15:38.437 "data_offset": 2048, 00:15:38.437 "data_size": 63488 00:15:38.437 }, 00:15:38.437 { 00:15:38.437 "name": "pt4", 00:15:38.437 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:38.437 "is_configured": true, 00:15:38.437 "data_offset": 2048, 00:15:38.437 "data_size": 63488 00:15:38.437 } 00:15:38.437 ] 00:15:38.437 }' 00:15:38.437 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.437 13:38:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:39.003 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:39.262 [2024-07-15 13:38:26.691509] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:39.262 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:39.262 "name": "raid_bdev1", 00:15:39.262 "aliases": [ 00:15:39.262 "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac" 00:15:39.262 ], 00:15:39.262 "product_name": "Raid Volume", 00:15:39.262 "block_size": 512, 00:15:39.262 "num_blocks": 253952, 00:15:39.262 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:39.262 "assigned_rate_limits": { 00:15:39.262 "rw_ios_per_sec": 0, 00:15:39.262 "rw_mbytes_per_sec": 0, 00:15:39.262 "r_mbytes_per_sec": 0, 00:15:39.262 "w_mbytes_per_sec": 0 00:15:39.262 }, 00:15:39.262 "claimed": false, 00:15:39.262 "zoned": false, 00:15:39.262 "supported_io_types": { 00:15:39.262 "read": true, 00:15:39.262 "write": true, 00:15:39.262 "unmap": true, 00:15:39.262 "flush": true, 00:15:39.262 "reset": true, 00:15:39.262 "nvme_admin": false, 00:15:39.262 "nvme_io": false, 00:15:39.262 "nvme_io_md": false, 00:15:39.262 "write_zeroes": true, 00:15:39.262 "zcopy": false, 00:15:39.262 "get_zone_info": false, 00:15:39.262 "zone_management": false, 00:15:39.262 "zone_append": false, 00:15:39.262 "compare": false, 00:15:39.262 "compare_and_write": false, 00:15:39.262 "abort": false, 00:15:39.262 "seek_hole": false, 00:15:39.262 "seek_data": false, 00:15:39.262 "copy": false, 00:15:39.262 "nvme_iov_md": false 00:15:39.262 }, 00:15:39.262 "memory_domains": [ 00:15:39.262 { 00:15:39.262 "dma_device_id": "system", 00:15:39.262 "dma_device_type": 1 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.262 "dma_device_type": 2 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "system", 00:15:39.262 "dma_device_type": 1 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.262 "dma_device_type": 2 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "system", 00:15:39.262 "dma_device_type": 1 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.262 "dma_device_type": 2 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "system", 00:15:39.262 "dma_device_type": 1 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.262 "dma_device_type": 2 00:15:39.262 } 00:15:39.262 ], 00:15:39.262 "driver_specific": { 00:15:39.262 "raid": { 00:15:39.262 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:39.262 "strip_size_kb": 64, 00:15:39.262 "state": "online", 00:15:39.262 "raid_level": "raid0", 00:15:39.262 "superblock": 
true, 00:15:39.262 "num_base_bdevs": 4, 00:15:39.262 "num_base_bdevs_discovered": 4, 00:15:39.262 "num_base_bdevs_operational": 4, 00:15:39.262 "base_bdevs_list": [ 00:15:39.262 { 00:15:39.262 "name": "pt1", 00:15:39.262 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:39.262 "is_configured": true, 00:15:39.262 "data_offset": 2048, 00:15:39.262 "data_size": 63488 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "name": "pt2", 00:15:39.262 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.262 "is_configured": true, 00:15:39.262 "data_offset": 2048, 00:15:39.262 "data_size": 63488 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "name": "pt3", 00:15:39.262 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:39.262 "is_configured": true, 00:15:39.262 "data_offset": 2048, 00:15:39.262 "data_size": 63488 00:15:39.262 }, 00:15:39.262 { 00:15:39.262 "name": "pt4", 00:15:39.262 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:39.262 "is_configured": true, 00:15:39.262 "data_offset": 2048, 00:15:39.262 "data_size": 63488 00:15:39.262 } 00:15:39.262 ] 00:15:39.262 } 00:15:39.262 } 00:15:39.262 }' 00:15:39.262 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:39.262 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:39.262 pt2 00:15:39.262 pt3 00:15:39.262 pt4' 00:15:39.262 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.262 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:39.262 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.520 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.520 "name": "pt1", 00:15:39.520 "aliases": [ 00:15:39.520 "00000000-0000-0000-0000-000000000001" 00:15:39.520 ], 00:15:39.520 "product_name": "passthru", 00:15:39.520 "block_size": 512, 00:15:39.520 "num_blocks": 65536, 00:15:39.520 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:39.520 "assigned_rate_limits": { 00:15:39.520 "rw_ios_per_sec": 0, 00:15:39.520 "rw_mbytes_per_sec": 0, 00:15:39.520 "r_mbytes_per_sec": 0, 00:15:39.520 "w_mbytes_per_sec": 0 00:15:39.520 }, 00:15:39.520 "claimed": true, 00:15:39.520 "claim_type": "exclusive_write", 00:15:39.520 "zoned": false, 00:15:39.520 "supported_io_types": { 00:15:39.520 "read": true, 00:15:39.520 "write": true, 00:15:39.520 "unmap": true, 00:15:39.520 "flush": true, 00:15:39.520 "reset": true, 00:15:39.520 "nvme_admin": false, 00:15:39.520 "nvme_io": false, 00:15:39.520 "nvme_io_md": false, 00:15:39.520 "write_zeroes": true, 00:15:39.520 "zcopy": true, 00:15:39.520 "get_zone_info": false, 00:15:39.520 "zone_management": false, 00:15:39.520 "zone_append": false, 00:15:39.520 "compare": false, 00:15:39.520 "compare_and_write": false, 00:15:39.520 "abort": true, 00:15:39.520 "seek_hole": false, 00:15:39.520 "seek_data": false, 00:15:39.520 "copy": true, 00:15:39.520 "nvme_iov_md": false 00:15:39.520 }, 00:15:39.520 "memory_domains": [ 00:15:39.520 { 00:15:39.520 "dma_device_id": "system", 00:15:39.520 "dma_device_type": 1 00:15:39.520 }, 00:15:39.520 { 00:15:39.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.520 "dma_device_type": 2 00:15:39.520 } 00:15:39.520 ], 00:15:39.520 "driver_specific": { 00:15:39.520 "passthru": 
{ 00:15:39.520 "name": "pt1", 00:15:39.520 "base_bdev_name": "malloc1" 00:15:39.520 } 00:15:39.520 } 00:15:39.520 }' 00:15:39.520 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.520 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.520 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.520 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.520 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.520 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.520 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.520 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.778 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.036 "name": "pt2", 00:15:40.036 "aliases": [ 00:15:40.036 "00000000-0000-0000-0000-000000000002" 00:15:40.036 ], 00:15:40.036 "product_name": "passthru", 00:15:40.036 "block_size": 512, 00:15:40.036 "num_blocks": 65536, 00:15:40.036 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:40.036 "assigned_rate_limits": { 00:15:40.036 "rw_ios_per_sec": 0, 00:15:40.036 "rw_mbytes_per_sec": 0, 00:15:40.036 "r_mbytes_per_sec": 0, 00:15:40.036 "w_mbytes_per_sec": 0 00:15:40.036 }, 00:15:40.036 "claimed": true, 00:15:40.036 "claim_type": "exclusive_write", 00:15:40.036 "zoned": false, 00:15:40.036 "supported_io_types": { 00:15:40.036 "read": true, 00:15:40.036 "write": true, 00:15:40.036 "unmap": true, 00:15:40.036 "flush": true, 00:15:40.036 "reset": true, 00:15:40.036 "nvme_admin": false, 00:15:40.036 "nvme_io": false, 00:15:40.036 "nvme_io_md": false, 00:15:40.036 "write_zeroes": true, 00:15:40.036 "zcopy": true, 00:15:40.036 "get_zone_info": false, 00:15:40.036 "zone_management": false, 00:15:40.036 "zone_append": false, 00:15:40.036 "compare": false, 00:15:40.036 "compare_and_write": false, 00:15:40.036 "abort": true, 00:15:40.036 "seek_hole": false, 00:15:40.036 "seek_data": false, 00:15:40.036 "copy": true, 00:15:40.036 "nvme_iov_md": false 00:15:40.036 }, 00:15:40.036 "memory_domains": [ 00:15:40.036 { 00:15:40.036 "dma_device_id": "system", 00:15:40.036 "dma_device_type": 1 00:15:40.036 }, 00:15:40.036 { 00:15:40.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.036 "dma_device_type": 2 00:15:40.036 } 00:15:40.036 ], 00:15:40.036 "driver_specific": { 00:15:40.036 "passthru": { 00:15:40.036 "name": "pt2", 00:15:40.036 "base_bdev_name": "malloc2" 00:15:40.036 } 00:15:40.036 } 00:15:40.036 }' 00:15:40.036 
13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.036 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.037 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.037 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.295 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.295 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.295 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:40.295 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:40.295 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.295 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.295 "name": "pt3", 00:15:40.295 "aliases": [ 00:15:40.295 "00000000-0000-0000-0000-000000000003" 00:15:40.295 ], 00:15:40.295 "product_name": "passthru", 00:15:40.295 "block_size": 512, 00:15:40.295 "num_blocks": 65536, 00:15:40.295 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:40.295 "assigned_rate_limits": { 00:15:40.295 "rw_ios_per_sec": 0, 00:15:40.295 "rw_mbytes_per_sec": 0, 00:15:40.295 "r_mbytes_per_sec": 0, 00:15:40.295 "w_mbytes_per_sec": 0 00:15:40.295 }, 00:15:40.295 "claimed": true, 00:15:40.295 "claim_type": "exclusive_write", 00:15:40.295 "zoned": false, 00:15:40.295 "supported_io_types": { 00:15:40.295 "read": true, 00:15:40.295 "write": true, 00:15:40.295 "unmap": true, 00:15:40.295 "flush": true, 00:15:40.295 "reset": true, 00:15:40.295 "nvme_admin": false, 00:15:40.295 "nvme_io": false, 00:15:40.295 "nvme_io_md": false, 00:15:40.295 "write_zeroes": true, 00:15:40.295 "zcopy": true, 00:15:40.295 "get_zone_info": false, 00:15:40.295 "zone_management": false, 00:15:40.295 "zone_append": false, 00:15:40.295 "compare": false, 00:15:40.295 "compare_and_write": false, 00:15:40.295 "abort": true, 00:15:40.295 "seek_hole": false, 00:15:40.295 "seek_data": false, 00:15:40.295 "copy": true, 00:15:40.295 "nvme_iov_md": false 00:15:40.295 }, 00:15:40.295 "memory_domains": [ 00:15:40.295 { 00:15:40.295 "dma_device_id": "system", 00:15:40.295 "dma_device_type": 1 00:15:40.295 }, 00:15:40.295 { 00:15:40.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.295 "dma_device_type": 2 00:15:40.295 } 00:15:40.295 ], 00:15:40.295 "driver_specific": { 00:15:40.295 "passthru": { 00:15:40.295 "name": "pt3", 00:15:40.296 "base_bdev_name": "malloc3" 00:15:40.296 } 00:15:40.296 } 00:15:40.296 }' 00:15:40.296 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.554 13:38:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.554 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.554 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.554 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.554 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.554 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.554 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.554 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.554 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.554 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.812 "name": "pt4", 00:15:40.812 "aliases": [ 00:15:40.812 "00000000-0000-0000-0000-000000000004" 00:15:40.812 ], 00:15:40.812 "product_name": "passthru", 00:15:40.812 "block_size": 512, 00:15:40.812 "num_blocks": 65536, 00:15:40.812 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:40.812 "assigned_rate_limits": { 00:15:40.812 "rw_ios_per_sec": 0, 00:15:40.812 "rw_mbytes_per_sec": 0, 00:15:40.812 "r_mbytes_per_sec": 0, 00:15:40.812 "w_mbytes_per_sec": 0 00:15:40.812 }, 00:15:40.812 "claimed": true, 00:15:40.812 "claim_type": "exclusive_write", 00:15:40.812 "zoned": false, 00:15:40.812 "supported_io_types": { 00:15:40.812 "read": true, 00:15:40.812 "write": true, 00:15:40.812 "unmap": true, 00:15:40.812 "flush": true, 00:15:40.812 "reset": true, 00:15:40.812 "nvme_admin": false, 00:15:40.812 "nvme_io": false, 00:15:40.812 "nvme_io_md": false, 00:15:40.812 "write_zeroes": true, 00:15:40.812 "zcopy": true, 00:15:40.812 "get_zone_info": false, 00:15:40.812 "zone_management": false, 00:15:40.812 "zone_append": false, 00:15:40.812 "compare": false, 00:15:40.812 "compare_and_write": false, 00:15:40.812 "abort": true, 00:15:40.812 "seek_hole": false, 00:15:40.812 "seek_data": false, 00:15:40.812 "copy": true, 00:15:40.812 "nvme_iov_md": false 00:15:40.812 }, 00:15:40.812 "memory_domains": [ 00:15:40.812 { 00:15:40.812 "dma_device_id": "system", 00:15:40.812 "dma_device_type": 1 00:15:40.812 }, 00:15:40.812 { 00:15:40.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.812 "dma_device_type": 2 00:15:40.812 } 00:15:40.812 ], 00:15:40.812 "driver_specific": { 00:15:40.812 "passthru": { 00:15:40.812 "name": "pt4", 00:15:40.812 "base_bdev_name": "malloc4" 00:15:40.812 } 00:15:40.812 } 00:15:40.812 }' 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.812 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:41.071 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:41.330 [2024-07-15 13:38:28.837073] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:41.330 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac 00:15:41.330 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac ']' 00:15:41.330 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:41.590 [2024-07-15 13:38:29.005297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:41.590 [2024-07-15 13:38:29.005313] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:41.590 [2024-07-15 13:38:29.005350] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:41.590 [2024-07-15 13:38:29.005393] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:41.590 [2024-07-15 13:38:29.005402] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14ba180 name raid_bdev1, state offline 00:15:41.590 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.590 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:41.590 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:41.590 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:41.590 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.590 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:41.849 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.849 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:42.107 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:42.108 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:42.108 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:42.108 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:15:42.366 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:42.366 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:42.625 [2024-07-15 13:38:30.208366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:42.625 [2024-07-15 13:38:30.209366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:42.625 [2024-07-15 13:38:30.209397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:15:42.625 [2024-07-15 13:38:30.209420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:15:42.625 [2024-07-15 13:38:30.209453] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:42.625 [2024-07-15 13:38:30.209481] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:42.625 [2024-07-15 13:38:30.209512] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:42.625 [2024-07-15 13:38:30.209526] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:15:42.625 [2024-07-15 13:38:30.209538] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:42.625 [2024-07-15 13:38:30.209546] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x130b8d0 name raid_bdev1, state configuring 00:15:42.625 request: 00:15:42.625 { 00:15:42.625 "name": "raid_bdev1", 00:15:42.625 "raid_level": "raid0", 00:15:42.625 "base_bdevs": [ 00:15:42.625 "malloc1", 00:15:42.625 "malloc2", 00:15:42.625 "malloc3", 00:15:42.625 "malloc4" 00:15:42.625 ], 00:15:42.625 "strip_size_kb": 64, 00:15:42.625 "superblock": false, 00:15:42.625 "method": "bdev_raid_create", 00:15:42.625 "req_id": 1 00:15:42.625 } 00:15:42.625 Got JSON-RPC error response 00:15:42.625 response: 00:15:42.625 { 00:15:42.625 "code": -17, 00:15:42.625 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:42.625 } 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.625 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:42.884 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:42.884 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:42.884 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:43.143 [2024-07-15 13:38:30.553230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:43.143 [2024-07-15 13:38:30.553262] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.143 [2024-07-15 13:38:30.553292] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b6540 00:15:43.143 [2024-07-15 13:38:30.553301] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.143 [2024-07-15 13:38:30.554359] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.143 [2024-07-15 13:38:30.554380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:43.143 [2024-07-15 
13:38:30.554422] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:43.143 [2024-07-15 13:38:30.554440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:43.143 pt1 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.143 "name": "raid_bdev1", 00:15:43.143 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:43.143 "strip_size_kb": 64, 00:15:43.143 "state": "configuring", 00:15:43.143 "raid_level": "raid0", 00:15:43.143 "superblock": true, 00:15:43.143 "num_base_bdevs": 4, 00:15:43.143 "num_base_bdevs_discovered": 1, 00:15:43.143 "num_base_bdevs_operational": 4, 00:15:43.143 "base_bdevs_list": [ 00:15:43.143 { 00:15:43.143 "name": "pt1", 00:15:43.143 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.143 "is_configured": true, 00:15:43.143 "data_offset": 2048, 00:15:43.143 "data_size": 63488 00:15:43.143 }, 00:15:43.143 { 00:15:43.143 "name": null, 00:15:43.143 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.143 "is_configured": false, 00:15:43.143 "data_offset": 2048, 00:15:43.143 "data_size": 63488 00:15:43.143 }, 00:15:43.143 { 00:15:43.143 "name": null, 00:15:43.143 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.143 "is_configured": false, 00:15:43.143 "data_offset": 2048, 00:15:43.143 "data_size": 63488 00:15:43.143 }, 00:15:43.143 { 00:15:43.143 "name": null, 00:15:43.143 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:43.143 "is_configured": false, 00:15:43.143 "data_offset": 2048, 00:15:43.143 "data_size": 63488 00:15:43.143 } 00:15:43.143 ] 00:15:43.143 }' 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.143 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.709 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:15:43.709 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:43.967 [2024-07-15 13:38:31.403420] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:43.967 [2024-07-15 13:38:31.403453] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.967 [2024-07-15 13:38:31.403465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bb4b0 00:15:43.967 [2024-07-15 13:38:31.403473] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.967 [2024-07-15 13:38:31.403712] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.967 [2024-07-15 13:38:31.403724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:43.967 [2024-07-15 13:38:31.403763] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:43.967 [2024-07-15 13:38:31.403776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:43.967 pt2 00:15:43.967 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:43.967 [2024-07-15 13:38:31.579891] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.250 "name": "raid_bdev1", 00:15:44.250 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:44.250 "strip_size_kb": 64, 00:15:44.250 "state": "configuring", 00:15:44.250 "raid_level": "raid0", 00:15:44.250 "superblock": true, 00:15:44.250 "num_base_bdevs": 4, 00:15:44.250 "num_base_bdevs_discovered": 1, 00:15:44.250 "num_base_bdevs_operational": 4, 00:15:44.250 "base_bdevs_list": [ 00:15:44.250 { 00:15:44.250 "name": "pt1", 00:15:44.250 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:44.250 "is_configured": true, 00:15:44.250 "data_offset": 2048, 00:15:44.250 "data_size": 63488 00:15:44.250 }, 00:15:44.250 { 
00:15:44.250 "name": null, 00:15:44.250 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.250 "is_configured": false, 00:15:44.250 "data_offset": 2048, 00:15:44.250 "data_size": 63488 00:15:44.250 }, 00:15:44.250 { 00:15:44.250 "name": null, 00:15:44.250 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.250 "is_configured": false, 00:15:44.250 "data_offset": 2048, 00:15:44.250 "data_size": 63488 00:15:44.250 }, 00:15:44.250 { 00:15:44.250 "name": null, 00:15:44.250 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:44.250 "is_configured": false, 00:15:44.250 "data_offset": 2048, 00:15:44.250 "data_size": 63488 00:15:44.250 } 00:15:44.250 ] 00:15:44.250 }' 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.250 13:38:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.818 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:44.818 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:44.818 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:44.818 [2024-07-15 13:38:32.418043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:44.818 [2024-07-15 13:38:32.418080] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:44.818 [2024-07-15 13:38:32.418110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b9020 00:15:44.818 [2024-07-15 13:38:32.418118] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:44.818 [2024-07-15 13:38:32.418365] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:44.818 [2024-07-15 13:38:32.418377] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:44.818 [2024-07-15 13:38:32.418421] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:44.818 [2024-07-15 13:38:32.418435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:44.818 pt2 00:15:45.077 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.077 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.077 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:45.077 [2024-07-15 13:38:32.602528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:45.077 [2024-07-15 13:38:32.602556] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.077 [2024-07-15 13:38:32.602568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b8470 00:15:45.077 [2024-07-15 13:38:32.602577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.077 [2024-07-15 13:38:32.602795] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.077 [2024-07-15 13:38:32.602808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:45.077 [2024-07-15 13:38:32.602847] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:45.077 [2024-07-15 13:38:32.602860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:45.077 pt3 00:15:45.077 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.077 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.077 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:45.336 [2024-07-15 13:38:32.782983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:45.336 [2024-07-15 13:38:32.783006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.336 [2024-07-15 13:38:32.783016] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x130b080 00:15:45.336 [2024-07-15 13:38:32.783040] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.336 [2024-07-15 13:38:32.783234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.336 [2024-07-15 13:38:32.783246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:45.336 [2024-07-15 13:38:32.783278] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:15:45.336 [2024-07-15 13:38:32.783289] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:45.336 [2024-07-15 13:38:32.783370] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14bb7d0 00:15:45.336 [2024-07-15 13:38:32.783377] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:45.336 [2024-07-15 13:38:32.783500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b9450 00:15:45.336 [2024-07-15 13:38:32.783587] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14bb7d0 00:15:45.336 [2024-07-15 13:38:32.783593] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14bb7d0 00:15:45.336 [2024-07-15 13:38:32.783656] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:45.336 pt4 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.336 13:38:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.336 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:45.594 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.594 "name": "raid_bdev1", 00:15:45.594 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:45.594 "strip_size_kb": 64, 00:15:45.594 "state": "online", 00:15:45.594 "raid_level": "raid0", 00:15:45.594 "superblock": true, 00:15:45.594 "num_base_bdevs": 4, 00:15:45.594 "num_base_bdevs_discovered": 4, 00:15:45.594 "num_base_bdevs_operational": 4, 00:15:45.594 "base_bdevs_list": [ 00:15:45.594 { 00:15:45.594 "name": "pt1", 00:15:45.594 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:45.594 "is_configured": true, 00:15:45.594 "data_offset": 2048, 00:15:45.594 "data_size": 63488 00:15:45.594 }, 00:15:45.594 { 00:15:45.594 "name": "pt2", 00:15:45.594 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.594 "is_configured": true, 00:15:45.594 "data_offset": 2048, 00:15:45.594 "data_size": 63488 00:15:45.594 }, 00:15:45.594 { 00:15:45.594 "name": "pt3", 00:15:45.594 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:45.594 "is_configured": true, 00:15:45.594 "data_offset": 2048, 00:15:45.594 "data_size": 63488 00:15:45.594 }, 00:15:45.594 { 00:15:45.594 "name": "pt4", 00:15:45.594 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:45.594 "is_configured": true, 00:15:45.595 "data_offset": 2048, 00:15:45.595 "data_size": 63488 00:15:45.595 } 00:15:45.595 ] 00:15:45.595 }' 00:15:45.595 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.595 13:38:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.852 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:45.852 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:45.852 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:45.853 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:45.853 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:45.853 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:45.853 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:45.853 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:46.112 [2024-07-15 13:38:33.609332] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:46.112 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:46.112 "name": "raid_bdev1", 00:15:46.112 "aliases": [ 00:15:46.112 "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac" 00:15:46.112 ], 00:15:46.112 "product_name": "Raid Volume", 00:15:46.112 "block_size": 512, 00:15:46.112 "num_blocks": 253952, 00:15:46.112 "uuid": 
"6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:46.112 "assigned_rate_limits": { 00:15:46.112 "rw_ios_per_sec": 0, 00:15:46.112 "rw_mbytes_per_sec": 0, 00:15:46.112 "r_mbytes_per_sec": 0, 00:15:46.112 "w_mbytes_per_sec": 0 00:15:46.112 }, 00:15:46.112 "claimed": false, 00:15:46.112 "zoned": false, 00:15:46.112 "supported_io_types": { 00:15:46.112 "read": true, 00:15:46.112 "write": true, 00:15:46.112 "unmap": true, 00:15:46.112 "flush": true, 00:15:46.112 "reset": true, 00:15:46.112 "nvme_admin": false, 00:15:46.112 "nvme_io": false, 00:15:46.112 "nvme_io_md": false, 00:15:46.112 "write_zeroes": true, 00:15:46.112 "zcopy": false, 00:15:46.112 "get_zone_info": false, 00:15:46.112 "zone_management": false, 00:15:46.112 "zone_append": false, 00:15:46.112 "compare": false, 00:15:46.112 "compare_and_write": false, 00:15:46.112 "abort": false, 00:15:46.112 "seek_hole": false, 00:15:46.112 "seek_data": false, 00:15:46.112 "copy": false, 00:15:46.112 "nvme_iov_md": false 00:15:46.112 }, 00:15:46.112 "memory_domains": [ 00:15:46.112 { 00:15:46.112 "dma_device_id": "system", 00:15:46.112 "dma_device_type": 1 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.112 "dma_device_type": 2 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "system", 00:15:46.112 "dma_device_type": 1 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.112 "dma_device_type": 2 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "system", 00:15:46.112 "dma_device_type": 1 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.112 "dma_device_type": 2 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "system", 00:15:46.112 "dma_device_type": 1 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.112 "dma_device_type": 2 00:15:46.112 } 00:15:46.112 ], 00:15:46.112 "driver_specific": { 00:15:46.112 "raid": { 00:15:46.112 "uuid": "6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac", 00:15:46.112 "strip_size_kb": 64, 00:15:46.112 "state": "online", 00:15:46.112 "raid_level": "raid0", 00:15:46.112 "superblock": true, 00:15:46.112 "num_base_bdevs": 4, 00:15:46.112 "num_base_bdevs_discovered": 4, 00:15:46.112 "num_base_bdevs_operational": 4, 00:15:46.112 "base_bdevs_list": [ 00:15:46.112 { 00:15:46.112 "name": "pt1", 00:15:46.112 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:46.112 "is_configured": true, 00:15:46.112 "data_offset": 2048, 00:15:46.112 "data_size": 63488 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "name": "pt2", 00:15:46.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:46.112 "is_configured": true, 00:15:46.112 "data_offset": 2048, 00:15:46.112 "data_size": 63488 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "name": "pt3", 00:15:46.112 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:46.112 "is_configured": true, 00:15:46.112 "data_offset": 2048, 00:15:46.112 "data_size": 63488 00:15:46.112 }, 00:15:46.112 { 00:15:46.112 "name": "pt4", 00:15:46.112 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:46.112 "is_configured": true, 00:15:46.112 "data_offset": 2048, 00:15:46.112 "data_size": 63488 00:15:46.112 } 00:15:46.112 ] 00:15:46.112 } 00:15:46.112 } 00:15:46.112 }' 00:15:46.112 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:46.112 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:15:46.112 pt2 00:15:46.113 pt3 00:15:46.113 pt4' 00:15:46.113 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.113 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:46.113 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.372 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.372 "name": "pt1", 00:15:46.372 "aliases": [ 00:15:46.372 "00000000-0000-0000-0000-000000000001" 00:15:46.373 ], 00:15:46.373 "product_name": "passthru", 00:15:46.373 "block_size": 512, 00:15:46.373 "num_blocks": 65536, 00:15:46.373 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:46.373 "assigned_rate_limits": { 00:15:46.373 "rw_ios_per_sec": 0, 00:15:46.373 "rw_mbytes_per_sec": 0, 00:15:46.373 "r_mbytes_per_sec": 0, 00:15:46.373 "w_mbytes_per_sec": 0 00:15:46.373 }, 00:15:46.373 "claimed": true, 00:15:46.373 "claim_type": "exclusive_write", 00:15:46.373 "zoned": false, 00:15:46.373 "supported_io_types": { 00:15:46.373 "read": true, 00:15:46.373 "write": true, 00:15:46.373 "unmap": true, 00:15:46.373 "flush": true, 00:15:46.373 "reset": true, 00:15:46.373 "nvme_admin": false, 00:15:46.373 "nvme_io": false, 00:15:46.373 "nvme_io_md": false, 00:15:46.373 "write_zeroes": true, 00:15:46.373 "zcopy": true, 00:15:46.373 "get_zone_info": false, 00:15:46.373 "zone_management": false, 00:15:46.373 "zone_append": false, 00:15:46.373 "compare": false, 00:15:46.373 "compare_and_write": false, 00:15:46.373 "abort": true, 00:15:46.373 "seek_hole": false, 00:15:46.373 "seek_data": false, 00:15:46.373 "copy": true, 00:15:46.373 "nvme_iov_md": false 00:15:46.373 }, 00:15:46.373 "memory_domains": [ 00:15:46.373 { 00:15:46.373 "dma_device_id": "system", 00:15:46.373 "dma_device_type": 1 00:15:46.373 }, 00:15:46.373 { 00:15:46.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.373 "dma_device_type": 2 00:15:46.373 } 00:15:46.373 ], 00:15:46.373 "driver_specific": { 00:15:46.373 "passthru": { 00:15:46.373 "name": "pt1", 00:15:46.373 "base_bdev_name": "malloc1" 00:15:46.373 } 00:15:46.373 } 00:15:46.373 }' 00:15:46.373 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.373 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.373 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.373 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.373 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.634 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.634 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:46.634 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.892 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.892 "name": "pt2", 00:15:46.892 "aliases": [ 00:15:46.892 "00000000-0000-0000-0000-000000000002" 00:15:46.892 ], 00:15:46.892 "product_name": "passthru", 00:15:46.892 "block_size": 512, 00:15:46.892 "num_blocks": 65536, 00:15:46.892 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:46.892 "assigned_rate_limits": { 00:15:46.892 "rw_ios_per_sec": 0, 00:15:46.892 "rw_mbytes_per_sec": 0, 00:15:46.892 "r_mbytes_per_sec": 0, 00:15:46.892 "w_mbytes_per_sec": 0 00:15:46.892 }, 00:15:46.892 "claimed": true, 00:15:46.892 "claim_type": "exclusive_write", 00:15:46.892 "zoned": false, 00:15:46.892 "supported_io_types": { 00:15:46.892 "read": true, 00:15:46.892 "write": true, 00:15:46.892 "unmap": true, 00:15:46.892 "flush": true, 00:15:46.892 "reset": true, 00:15:46.892 "nvme_admin": false, 00:15:46.892 "nvme_io": false, 00:15:46.892 "nvme_io_md": false, 00:15:46.892 "write_zeroes": true, 00:15:46.892 "zcopy": true, 00:15:46.892 "get_zone_info": false, 00:15:46.892 "zone_management": false, 00:15:46.892 "zone_append": false, 00:15:46.892 "compare": false, 00:15:46.892 "compare_and_write": false, 00:15:46.892 "abort": true, 00:15:46.892 "seek_hole": false, 00:15:46.892 "seek_data": false, 00:15:46.892 "copy": true, 00:15:46.892 "nvme_iov_md": false 00:15:46.892 }, 00:15:46.892 "memory_domains": [ 00:15:46.892 { 00:15:46.892 "dma_device_id": "system", 00:15:46.892 "dma_device_type": 1 00:15:46.892 }, 00:15:46.892 { 00:15:46.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.892 "dma_device_type": 2 00:15:46.892 } 00:15:46.892 ], 00:15:46.893 "driver_specific": { 00:15:46.893 "passthru": { 00:15:46.893 "name": "pt2", 00:15:46.893 "base_bdev_name": "malloc2" 00:15:46.893 } 00:15:46.893 } 00:15:46.893 }' 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.893 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.150 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.150 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.150 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.150 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.150 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.150 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.151 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:47.151 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.409 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.409 "name": "pt3", 00:15:47.409 "aliases": [ 00:15:47.409 "00000000-0000-0000-0000-000000000003" 00:15:47.409 ], 00:15:47.409 "product_name": "passthru", 00:15:47.409 "block_size": 512, 00:15:47.409 "num_blocks": 65536, 00:15:47.409 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:47.409 "assigned_rate_limits": { 00:15:47.409 "rw_ios_per_sec": 0, 00:15:47.409 "rw_mbytes_per_sec": 0, 00:15:47.409 "r_mbytes_per_sec": 0, 00:15:47.409 "w_mbytes_per_sec": 0 00:15:47.409 }, 00:15:47.409 "claimed": true, 00:15:47.409 "claim_type": "exclusive_write", 00:15:47.409 "zoned": false, 00:15:47.409 "supported_io_types": { 00:15:47.409 "read": true, 00:15:47.409 "write": true, 00:15:47.409 "unmap": true, 00:15:47.409 "flush": true, 00:15:47.409 "reset": true, 00:15:47.409 "nvme_admin": false, 00:15:47.409 "nvme_io": false, 00:15:47.409 "nvme_io_md": false, 00:15:47.409 "write_zeroes": true, 00:15:47.409 "zcopy": true, 00:15:47.409 "get_zone_info": false, 00:15:47.409 "zone_management": false, 00:15:47.409 "zone_append": false, 00:15:47.409 "compare": false, 00:15:47.409 "compare_and_write": false, 00:15:47.409 "abort": true, 00:15:47.409 "seek_hole": false, 00:15:47.409 "seek_data": false, 00:15:47.409 "copy": true, 00:15:47.409 "nvme_iov_md": false 00:15:47.409 }, 00:15:47.409 "memory_domains": [ 00:15:47.409 { 00:15:47.409 "dma_device_id": "system", 00:15:47.409 "dma_device_type": 1 00:15:47.409 }, 00:15:47.410 { 00:15:47.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.410 "dma_device_type": 2 00:15:47.410 } 00:15:47.410 ], 00:15:47.410 "driver_specific": { 00:15:47.410 "passthru": { 00:15:47.410 "name": "pt3", 00:15:47.410 "base_bdev_name": "malloc3" 00:15:47.410 } 00:15:47.410 } 00:15:47.410 }' 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:47.410 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.410 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.667 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.667 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.667 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.667 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.667 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.667 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:47.667 
13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.926 "name": "pt4", 00:15:47.926 "aliases": [ 00:15:47.926 "00000000-0000-0000-0000-000000000004" 00:15:47.926 ], 00:15:47.926 "product_name": "passthru", 00:15:47.926 "block_size": 512, 00:15:47.926 "num_blocks": 65536, 00:15:47.926 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:47.926 "assigned_rate_limits": { 00:15:47.926 "rw_ios_per_sec": 0, 00:15:47.926 "rw_mbytes_per_sec": 0, 00:15:47.926 "r_mbytes_per_sec": 0, 00:15:47.926 "w_mbytes_per_sec": 0 00:15:47.926 }, 00:15:47.926 "claimed": true, 00:15:47.926 "claim_type": "exclusive_write", 00:15:47.926 "zoned": false, 00:15:47.926 "supported_io_types": { 00:15:47.926 "read": true, 00:15:47.926 "write": true, 00:15:47.926 "unmap": true, 00:15:47.926 "flush": true, 00:15:47.926 "reset": true, 00:15:47.926 "nvme_admin": false, 00:15:47.926 "nvme_io": false, 00:15:47.926 "nvme_io_md": false, 00:15:47.926 "write_zeroes": true, 00:15:47.926 "zcopy": true, 00:15:47.926 "get_zone_info": false, 00:15:47.926 "zone_management": false, 00:15:47.926 "zone_append": false, 00:15:47.926 "compare": false, 00:15:47.926 "compare_and_write": false, 00:15:47.926 "abort": true, 00:15:47.926 "seek_hole": false, 00:15:47.926 "seek_data": false, 00:15:47.926 "copy": true, 00:15:47.926 "nvme_iov_md": false 00:15:47.926 }, 00:15:47.926 "memory_domains": [ 00:15:47.926 { 00:15:47.926 "dma_device_id": "system", 00:15:47.926 "dma_device_type": 1 00:15:47.926 }, 00:15:47.926 { 00:15:47.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.926 "dma_device_type": 2 00:15:47.926 } 00:15:47.926 ], 00:15:47.926 "driver_specific": { 00:15:47.926 "passthru": { 00:15:47.926 "name": "pt4", 00:15:47.926 "base_bdev_name": "malloc4" 00:15:47.926 } 00:15:47.926 } 00:15:47.926 }' 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.926 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:48.184 [2024-07-15 13:38:35.770898] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:48.184 13:38:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac '!=' 6c338ec3-fd9d-47d5-83e0-c0db0ec0c9ac ']' 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 29166 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 29166 ']' 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 29166 00:15:48.184 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 29166 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 29166' 00:15:48.445 killing process with pid 29166 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 29166 00:15:48.445 [2024-07-15 13:38:35.843533] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:48.445 [2024-07-15 13:38:35.843579] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:48.445 [2024-07-15 13:38:35.843623] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:48.445 [2024-07-15 13:38:35.843631] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14bb7d0 name raid_bdev1, state offline 00:15:48.445 13:38:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 29166 00:15:48.445 [2024-07-15 13:38:35.880317] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:48.705 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:48.705 00:15:48.705 real 0m12.667s 00:15:48.705 user 0m22.691s 00:15:48.705 sys 0m2.385s 00:15:48.705 13:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:48.705 13:38:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.705 ************************************ 00:15:48.705 END TEST raid_superblock_test 00:15:48.705 ************************************ 00:15:48.705 13:38:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:48.705 13:38:36 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:15:48.705 13:38:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:48.705 13:38:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:48.705 13:38:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:48.705 ************************************ 00:15:48.705 START TEST raid_read_error_test 00:15:48.706 ************************************ 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test raid0 4 read 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GDwPW9GcWX 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=31094 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 31094 /var/tmp/spdk-raid.sock 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:48.706 13:38:36 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 31094 ']' 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:48.706 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:48.706 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.706 [2024-07-15 13:38:36.215052] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:15:48.706 [2024-07-15 13:38:36.215099] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid31094 ] 00:15:48.706 [2024-07-15 13:38:36.301877] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:48.964 [2024-07-15 13:38:36.392367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:48.964 [2024-07-15 13:38:36.448999] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:48.964 [2024-07-15 13:38:36.449027] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:49.530 13:38:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:49.530 13:38:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:49.530 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:49.530 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:49.789 BaseBdev1_malloc 00:15:49.789 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:49.789 true 00:15:49.789 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:50.048 [2024-07-15 13:38:37.492270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:50.048 [2024-07-15 13:38:37.492303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:50.048 [2024-07-15 13:38:37.492335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fb990 00:15:50.048 [2024-07-15 13:38:37.492345] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.048 [2024-07-15 13:38:37.493709] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.048 [2024-07-15 13:38:37.493732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:50.048 BaseBdev1 00:15:50.048 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for 
bdev in "${base_bdevs[@]}" 00:15:50.048 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:50.048 BaseBdev2_malloc 00:15:50.306 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:50.306 true 00:15:50.306 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:50.564 [2024-07-15 13:38:37.981969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:50.564 [2024-07-15 13:38:37.982008] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:50.564 [2024-07-15 13:38:37.982040] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a001d0 00:15:50.564 [2024-07-15 13:38:37.982049] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.564 [2024-07-15 13:38:37.983167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.564 [2024-07-15 13:38:37.983190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:50.564 BaseBdev2 00:15:50.564 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.564 13:38:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:50.564 BaseBdev3_malloc 00:15:50.564 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:50.823 true 00:15:50.823 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:51.081 [2024-07-15 13:38:38.487110] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:51.081 [2024-07-15 13:38:38.487157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.081 [2024-07-15 13:38:38.487188] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a02490 00:15:51.081 [2024-07-15 13:38:38.487197] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.081 [2024-07-15 13:38:38.488374] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.081 [2024-07-15 13:38:38.488402] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:51.081 BaseBdev3 00:15:51.081 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:51.081 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:51.081 BaseBdev4_malloc 00:15:51.081 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:51.339 true 00:15:51.339 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:51.598 [2024-07-15 13:38:38.989229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:51.598 [2024-07-15 13:38:38.989260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.598 [2024-07-15 13:38:38.989292] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a03360 00:15:51.598 [2024-07-15 13:38:38.989301] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.598 [2024-07-15 13:38:38.990442] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.598 [2024-07-15 13:38:38.990464] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:51.598 BaseBdev4 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:15:51.598 [2024-07-15 13:38:39.149682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:51.598 [2024-07-15 13:38:39.150706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:51.598 [2024-07-15 13:38:39.150759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:51.598 [2024-07-15 13:38:39.150801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:51.598 [2024-07-15 13:38:39.150969] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19fd4e0 00:15:51.598 [2024-07-15 13:38:39.150977] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:51.598 [2024-07-15 13:38:39.151134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1851b20 00:15:51.598 [2024-07-15 13:38:39.151246] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19fd4e0 00:15:51.598 [2024-07-15 13:38:39.151252] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19fd4e0 00:15:51.598 [2024-07-15 13:38:39.151326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.598 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:51.857 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.857 "name": "raid_bdev1", 00:15:51.857 "uuid": "d177271a-e5e0-4f37-bcdf-eca1c85d486e", 00:15:51.857 "strip_size_kb": 64, 00:15:51.857 "state": "online", 00:15:51.857 "raid_level": "raid0", 00:15:51.857 "superblock": true, 00:15:51.857 "num_base_bdevs": 4, 00:15:51.857 "num_base_bdevs_discovered": 4, 00:15:51.857 "num_base_bdevs_operational": 4, 00:15:51.857 "base_bdevs_list": [ 00:15:51.857 { 00:15:51.857 "name": "BaseBdev1", 00:15:51.857 "uuid": "7c110ab7-937d-5ad7-aad0-6bf3af3e330d", 00:15:51.857 "is_configured": true, 00:15:51.857 "data_offset": 2048, 00:15:51.857 "data_size": 63488 00:15:51.857 }, 00:15:51.857 { 00:15:51.857 "name": "BaseBdev2", 00:15:51.857 "uuid": "4be74ce4-4684-552d-b715-660c1f9e6df4", 00:15:51.857 "is_configured": true, 00:15:51.857 "data_offset": 2048, 00:15:51.857 "data_size": 63488 00:15:51.857 }, 00:15:51.857 { 00:15:51.857 "name": "BaseBdev3", 00:15:51.857 "uuid": "459ff09d-85ea-5ddd-b45b-13398a7a8122", 00:15:51.857 "is_configured": true, 00:15:51.857 "data_offset": 2048, 00:15:51.857 "data_size": 63488 00:15:51.857 }, 00:15:51.857 { 00:15:51.857 "name": "BaseBdev4", 00:15:51.857 "uuid": "12ce4536-a08a-5e14-b489-4ecd5bb7b177", 00:15:51.857 "is_configured": true, 00:15:51.857 "data_offset": 2048, 00:15:51.857 "data_size": 63488 00:15:51.857 } 00:15:51.857 ] 00:15:51.857 }' 00:15:51.857 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.857 13:38:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.425 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:52.425 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:52.425 [2024-07-15 13:38:39.923883] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ef880 00:15:53.408 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:53.671 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.671 "name": "raid_bdev1", 00:15:53.671 "uuid": "d177271a-e5e0-4f37-bcdf-eca1c85d486e", 00:15:53.672 "strip_size_kb": 64, 00:15:53.672 "state": "online", 00:15:53.672 "raid_level": "raid0", 00:15:53.672 "superblock": true, 00:15:53.672 "num_base_bdevs": 4, 00:15:53.672 "num_base_bdevs_discovered": 4, 00:15:53.672 "num_base_bdevs_operational": 4, 00:15:53.672 "base_bdevs_list": [ 00:15:53.672 { 00:15:53.672 "name": "BaseBdev1", 00:15:53.672 "uuid": "7c110ab7-937d-5ad7-aad0-6bf3af3e330d", 00:15:53.672 "is_configured": true, 00:15:53.672 "data_offset": 2048, 00:15:53.672 "data_size": 63488 00:15:53.672 }, 00:15:53.672 { 00:15:53.672 "name": "BaseBdev2", 00:15:53.672 "uuid": "4be74ce4-4684-552d-b715-660c1f9e6df4", 00:15:53.672 "is_configured": true, 00:15:53.672 "data_offset": 2048, 00:15:53.672 "data_size": 63488 00:15:53.672 }, 00:15:53.672 { 00:15:53.672 "name": "BaseBdev3", 00:15:53.672 "uuid": "459ff09d-85ea-5ddd-b45b-13398a7a8122", 00:15:53.672 "is_configured": true, 00:15:53.672 "data_offset": 2048, 00:15:53.672 "data_size": 63488 00:15:53.672 }, 00:15:53.672 { 00:15:53.672 "name": "BaseBdev4", 00:15:53.672 "uuid": "12ce4536-a08a-5e14-b489-4ecd5bb7b177", 00:15:53.672 "is_configured": true, 00:15:53.672 "data_offset": 2048, 00:15:53.672 "data_size": 63488 00:15:53.672 } 00:15:53.672 ] 00:15:53.672 }' 00:15:53.672 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.672 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.238 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:54.496 [2024-07-15 13:38:41.865467] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:54.496 [2024-07-15 13:38:41.865496] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:54.496 [2024-07-15 13:38:41.867586] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:54.496 [2024-07-15 13:38:41.867614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.496 [2024-07-15 13:38:41.867641] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:54.496 [2024-07-15 13:38:41.867648] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x19fd4e0 name raid_bdev1, state offline 00:15:54.496 0 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 31094 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 31094 ']' 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 31094 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 31094 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 31094' 00:15:54.496 killing process with pid 31094 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 31094 00:15:54.496 [2024-07-15 13:38:41.934758] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:54.496 13:38:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 31094 00:15:54.496 [2024-07-15 13:38:41.964449] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GDwPW9GcWX 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:15:54.754 00:15:54.754 real 0m6.029s 00:15:54.754 user 0m9.312s 00:15:54.754 sys 0m1.021s 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:54.754 13:38:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.754 ************************************ 00:15:54.754 END TEST raid_read_error_test 00:15:54.754 ************************************ 00:15:54.754 13:38:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:54.754 13:38:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:15:54.754 13:38:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:54.754 13:38:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:54.754 13:38:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:54.754 ************************************ 00:15:54.754 START TEST raid_write_error_test 00:15:54.754 ************************************ 00:15:54.754 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:15:54.754 13:38:42 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:54.754 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:54.754 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:54.754 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.BeNdCsXmJ6 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=32068 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 32068 /var/tmp/spdk-raid.sock 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:54.755 13:38:42 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 32068 ']' 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:54.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:54.755 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.755 [2024-07-15 13:38:42.337612] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:15:54.755 [2024-07-15 13:38:42.337672] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid32068 ] 00:15:55.013 [2024-07-15 13:38:42.425142] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:55.013 [2024-07-15 13:38:42.517925] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:55.013 [2024-07-15 13:38:42.576206] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.013 [2024-07-15 13:38:42.576233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.598 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:55.598 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:55.598 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:55.598 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:55.855 BaseBdev1_malloc 00:15:55.855 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:56.113 true 00:15:56.113 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:56.113 [2024-07-15 13:38:43.653589] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:56.113 [2024-07-15 13:38:43.653627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.113 [2024-07-15 13:38:43.653640] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe27990 00:15:56.113 [2024-07-15 13:38:43.653648] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.113 [2024-07-15 13:38:43.654828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.113 [2024-07-15 13:38:43.654849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:56.113 BaseBdev1 00:15:56.113 13:38:43 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:56.113 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:56.371 BaseBdev2_malloc 00:15:56.371 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:56.629 true 00:15:56.629 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:56.629 [2024-07-15 13:38:44.170642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:56.629 [2024-07-15 13:38:44.170680] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.629 [2024-07-15 13:38:44.170694] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2c1d0 00:15:56.629 [2024-07-15 13:38:44.170703] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.629 [2024-07-15 13:38:44.171761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.629 [2024-07-15 13:38:44.171784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:56.629 BaseBdev2 00:15:56.630 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:56.630 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:56.888 BaseBdev3_malloc 00:15:56.888 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:57.147 true 00:15:57.147 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:57.147 [2024-07-15 13:38:44.703757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:57.147 [2024-07-15 13:38:44.703810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.147 [2024-07-15 13:38:44.703824] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2e490 00:15:57.147 [2024-07-15 13:38:44.703832] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.147 [2024-07-15 13:38:44.704889] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.147 [2024-07-15 13:38:44.704910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:57.147 BaseBdev3 00:15:57.147 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:57.147 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:57.406 BaseBdev4_malloc 00:15:57.406 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:57.665 true 00:15:57.665 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:57.665 [2024-07-15 13:38:45.240912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:57.665 [2024-07-15 13:38:45.240947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.665 [2024-07-15 13:38:45.240960] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2f360 00:15:57.665 [2024-07-15 13:38:45.240969] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.665 [2024-07-15 13:38:45.241926] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.665 [2024-07-15 13:38:45.241946] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:57.665 BaseBdev4 00:15:57.665 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:15:57.925 [2024-07-15 13:38:45.413382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:57.925 [2024-07-15 13:38:45.414199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:57.925 [2024-07-15 13:38:45.414255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:57.925 [2024-07-15 13:38:45.414296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:57.925 [2024-07-15 13:38:45.414452] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe294e0 00:15:57.925 [2024-07-15 13:38:45.414459] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:57.925 [2024-07-15 13:38:45.414580] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7db20 00:15:57.925 [2024-07-15 13:38:45.414678] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe294e0 00:15:57.925 [2024-07-15 13:38:45.414685] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe294e0 00:15:57.925 [2024-07-15 13:38:45.414754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.925 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:58.184 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.184 "name": "raid_bdev1", 00:15:58.184 "uuid": "2e18430f-e15d-4df1-aef9-64c526c940c5", 00:15:58.184 "strip_size_kb": 64, 00:15:58.184 "state": "online", 00:15:58.184 "raid_level": "raid0", 00:15:58.184 "superblock": true, 00:15:58.184 "num_base_bdevs": 4, 00:15:58.184 "num_base_bdevs_discovered": 4, 00:15:58.184 "num_base_bdevs_operational": 4, 00:15:58.184 "base_bdevs_list": [ 00:15:58.184 { 00:15:58.184 "name": "BaseBdev1", 00:15:58.184 "uuid": "c80667e2-6504-55ce-8ce8-181d166362d8", 00:15:58.184 "is_configured": true, 00:15:58.184 "data_offset": 2048, 00:15:58.184 "data_size": 63488 00:15:58.184 }, 00:15:58.184 { 00:15:58.184 "name": "BaseBdev2", 00:15:58.184 "uuid": "4ffcebed-adbb-5ce0-a4bf-a55f0cadbe36", 00:15:58.184 "is_configured": true, 00:15:58.184 "data_offset": 2048, 00:15:58.184 "data_size": 63488 00:15:58.184 }, 00:15:58.184 { 00:15:58.184 "name": "BaseBdev3", 00:15:58.184 "uuid": "570accd5-3b24-59f6-b758-7a4a863d3ccc", 00:15:58.184 "is_configured": true, 00:15:58.184 "data_offset": 2048, 00:15:58.184 "data_size": 63488 00:15:58.184 }, 00:15:58.184 { 00:15:58.184 "name": "BaseBdev4", 00:15:58.184 "uuid": "a16283b4-e627-597f-b454-04903f7891ba", 00:15:58.184 "is_configured": true, 00:15:58.184 "data_offset": 2048, 00:15:58.184 "data_size": 63488 00:15:58.184 } 00:15:58.184 ] 00:15:58.184 }' 00:15:58.184 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.184 13:38:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.752 13:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:58.752 13:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:58.752 [2024-07-15 13:38:46.183644] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe1b880 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.690 13:38:47 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:59.690 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.948 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.948 "name": "raid_bdev1", 00:15:59.948 "uuid": "2e18430f-e15d-4df1-aef9-64c526c940c5", 00:15:59.948 "strip_size_kb": 64, 00:15:59.948 "state": "online", 00:15:59.948 "raid_level": "raid0", 00:15:59.948 "superblock": true, 00:15:59.948 "num_base_bdevs": 4, 00:15:59.948 "num_base_bdevs_discovered": 4, 00:15:59.948 "num_base_bdevs_operational": 4, 00:15:59.948 "base_bdevs_list": [ 00:15:59.948 { 00:15:59.948 "name": "BaseBdev1", 00:15:59.948 "uuid": "c80667e2-6504-55ce-8ce8-181d166362d8", 00:15:59.948 "is_configured": true, 00:15:59.948 "data_offset": 2048, 00:15:59.948 "data_size": 63488 00:15:59.948 }, 00:15:59.948 { 00:15:59.948 "name": "BaseBdev2", 00:15:59.948 "uuid": "4ffcebed-adbb-5ce0-a4bf-a55f0cadbe36", 00:15:59.949 "is_configured": true, 00:15:59.949 "data_offset": 2048, 00:15:59.949 "data_size": 63488 00:15:59.949 }, 00:15:59.949 { 00:15:59.949 "name": "BaseBdev3", 00:15:59.949 "uuid": "570accd5-3b24-59f6-b758-7a4a863d3ccc", 00:15:59.949 "is_configured": true, 00:15:59.949 "data_offset": 2048, 00:15:59.949 "data_size": 63488 00:15:59.949 }, 00:15:59.949 { 00:15:59.949 "name": "BaseBdev4", 00:15:59.949 "uuid": "a16283b4-e627-597f-b454-04903f7891ba", 00:15:59.949 "is_configured": true, 00:15:59.949 "data_offset": 2048, 00:15:59.949 "data_size": 63488 00:15:59.949 } 00:15:59.949 ] 00:15:59.949 }' 00:15:59.949 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.949 13:38:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.515 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:00.515 [2024-07-15 13:38:48.129436] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:00.515 [2024-07-15 13:38:48.129471] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:00.515 [2024-07-15 13:38:48.131700] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:00.515 [2024-07-15 13:38:48.131728] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:00.515 [2024-07-15 13:38:48.131756] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:00.515 [2024-07-15 
13:38:48.131764] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe294e0 name raid_bdev1, state offline 00:16:00.773 0 00:16:00.773 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 32068 00:16:00.773 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 32068 ']' 00:16:00.773 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 32068 00:16:00.773 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:00.773 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:00.773 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 32068 00:16:00.774 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:00.774 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:00.774 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 32068' 00:16:00.774 killing process with pid 32068 00:16:00.774 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 32068 00:16:00.774 [2024-07-15 13:38:48.196637] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:00.774 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 32068 00:16:00.774 [2024-07-15 13:38:48.227092] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.BeNdCsXmJ6 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:01.032 00:16:01.032 real 0m6.179s 00:16:01.032 user 0m9.497s 00:16:01.032 sys 0m1.148s 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:01.032 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.032 ************************************ 00:16:01.032 END TEST raid_write_error_test 00:16:01.032 ************************************ 00:16:01.032 13:38:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:01.032 13:38:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:01.032 13:38:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:16:01.032 13:38:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:01.032 13:38:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:01.032 13:38:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:01.032 ************************************ 00:16:01.032 START TEST raid_state_function_test 00:16:01.032 
************************************ 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=32886 00:16:01.032 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 32886' 
00:16:01.033 Process raid pid: 32886 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 32886 /var/tmp/spdk-raid.sock 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 32886 ']' 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:01.033 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:01.033 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.033 [2024-07-15 13:38:48.597012] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:16:01.033 [2024-07-15 13:38:48.597069] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:01.291 [2024-07-15 13:38:48.685926] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:01.291 [2024-07-15 13:38:48.772888] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:01.291 [2024-07-15 13:38:48.830036] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:01.291 [2024-07-15 13:38:48.830061] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:01.857 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:01.857 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:01.857 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:02.116 [2024-07-15 13:38:49.555941] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:02.116 [2024-07-15 13:38:49.555976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:02.116 [2024-07-15 13:38:49.555983] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:02.116 [2024-07-15 13:38:49.555991] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:02.116 [2024-07-15 13:38:49.556001] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:02.116 [2024-07-15 13:38:49.556009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:02.116 [2024-07-15 13:38:49.556014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:02.116 [2024-07-15 13:38:49.556022] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.116 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.374 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.374 "name": "Existed_Raid", 00:16:02.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.374 "strip_size_kb": 64, 00:16:02.374 "state": "configuring", 00:16:02.374 "raid_level": "concat", 00:16:02.374 "superblock": false, 00:16:02.374 "num_base_bdevs": 4, 00:16:02.374 "num_base_bdevs_discovered": 0, 00:16:02.374 "num_base_bdevs_operational": 4, 00:16:02.374 "base_bdevs_list": [ 00:16:02.374 { 00:16:02.374 "name": "BaseBdev1", 00:16:02.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.374 "is_configured": false, 00:16:02.374 "data_offset": 0, 00:16:02.374 "data_size": 0 00:16:02.374 }, 00:16:02.374 { 00:16:02.374 "name": "BaseBdev2", 00:16:02.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.374 "is_configured": false, 00:16:02.374 "data_offset": 0, 00:16:02.374 "data_size": 0 00:16:02.374 }, 00:16:02.374 { 00:16:02.374 "name": "BaseBdev3", 00:16:02.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.374 "is_configured": false, 00:16:02.374 "data_offset": 0, 00:16:02.374 "data_size": 0 00:16:02.374 }, 00:16:02.374 { 00:16:02.374 "name": "BaseBdev4", 00:16:02.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.375 "is_configured": false, 00:16:02.375 "data_offset": 0, 00:16:02.375 "data_size": 0 00:16:02.375 } 00:16:02.375 ] 00:16:02.375 }' 00:16:02.375 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.375 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.632 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:02.891 [2024-07-15 13:38:50.406043] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:02.891 [2024-07-15 
13:38:50.406069] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ceef70 name Existed_Raid, state configuring 00:16:02.891 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:03.150 [2024-07-15 13:38:50.582516] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:03.150 [2024-07-15 13:38:50.582546] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:03.150 [2024-07-15 13:38:50.582552] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:03.150 [2024-07-15 13:38:50.582560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:03.150 [2024-07-15 13:38:50.582565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:03.150 [2024-07-15 13:38:50.582573] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:03.150 [2024-07-15 13:38:50.582578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:03.150 [2024-07-15 13:38:50.582589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:03.150 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:03.150 [2024-07-15 13:38:50.767856] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:03.408 BaseBdev1 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.408 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:03.666 [ 00:16:03.666 { 00:16:03.666 "name": "BaseBdev1", 00:16:03.666 "aliases": [ 00:16:03.666 "7802b948-486a-4fba-bf73-06b13bf27db9" 00:16:03.666 ], 00:16:03.666 "product_name": "Malloc disk", 00:16:03.666 "block_size": 512, 00:16:03.666 "num_blocks": 65536, 00:16:03.666 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:03.666 "assigned_rate_limits": { 00:16:03.666 "rw_ios_per_sec": 0, 00:16:03.666 "rw_mbytes_per_sec": 0, 00:16:03.666 "r_mbytes_per_sec": 0, 00:16:03.666 "w_mbytes_per_sec": 0 00:16:03.666 }, 00:16:03.666 "claimed": true, 00:16:03.666 "claim_type": "exclusive_write", 00:16:03.666 "zoned": false, 00:16:03.666 
"supported_io_types": { 00:16:03.666 "read": true, 00:16:03.666 "write": true, 00:16:03.666 "unmap": true, 00:16:03.666 "flush": true, 00:16:03.666 "reset": true, 00:16:03.666 "nvme_admin": false, 00:16:03.666 "nvme_io": false, 00:16:03.666 "nvme_io_md": false, 00:16:03.666 "write_zeroes": true, 00:16:03.666 "zcopy": true, 00:16:03.666 "get_zone_info": false, 00:16:03.666 "zone_management": false, 00:16:03.666 "zone_append": false, 00:16:03.666 "compare": false, 00:16:03.666 "compare_and_write": false, 00:16:03.666 "abort": true, 00:16:03.666 "seek_hole": false, 00:16:03.666 "seek_data": false, 00:16:03.666 "copy": true, 00:16:03.666 "nvme_iov_md": false 00:16:03.666 }, 00:16:03.666 "memory_domains": [ 00:16:03.666 { 00:16:03.666 "dma_device_id": "system", 00:16:03.666 "dma_device_type": 1 00:16:03.666 }, 00:16:03.666 { 00:16:03.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.666 "dma_device_type": 2 00:16:03.666 } 00:16:03.666 ], 00:16:03.666 "driver_specific": {} 00:16:03.666 } 00:16:03.666 ] 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.666 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.925 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.925 "name": "Existed_Raid", 00:16:03.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.925 "strip_size_kb": 64, 00:16:03.925 "state": "configuring", 00:16:03.925 "raid_level": "concat", 00:16:03.925 "superblock": false, 00:16:03.925 "num_base_bdevs": 4, 00:16:03.925 "num_base_bdevs_discovered": 1, 00:16:03.925 "num_base_bdevs_operational": 4, 00:16:03.925 "base_bdevs_list": [ 00:16:03.925 { 00:16:03.925 "name": "BaseBdev1", 00:16:03.925 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:03.925 "is_configured": true, 00:16:03.925 "data_offset": 0, 00:16:03.925 "data_size": 65536 00:16:03.925 }, 00:16:03.925 { 00:16:03.925 "name": "BaseBdev2", 00:16:03.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.925 "is_configured": false, 00:16:03.925 "data_offset": 0, 00:16:03.925 "data_size": 0 
00:16:03.925 }, 00:16:03.925 { 00:16:03.925 "name": "BaseBdev3", 00:16:03.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.925 "is_configured": false, 00:16:03.925 "data_offset": 0, 00:16:03.925 "data_size": 0 00:16:03.925 }, 00:16:03.925 { 00:16:03.925 "name": "BaseBdev4", 00:16:03.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.925 "is_configured": false, 00:16:03.925 "data_offset": 0, 00:16:03.925 "data_size": 0 00:16:03.925 } 00:16:03.925 ] 00:16:03.925 }' 00:16:03.925 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.925 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.491 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:04.491 [2024-07-15 13:38:51.966951] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:04.491 [2024-07-15 13:38:51.966986] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cee7e0 name Existed_Raid, state configuring 00:16:04.491 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:04.749 [2024-07-15 13:38:52.139411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:04.749 [2024-07-15 13:38:52.140415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:04.749 [2024-07-15 13:38:52.140440] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:04.749 [2024-07-15 13:38:52.140447] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:04.749 [2024-07-15 13:38:52.140454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:04.749 [2024-07-15 13:38:52.140460] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:04.749 [2024-07-15 13:38:52.140466] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.749 13:38:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.749 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.749 "name": "Existed_Raid", 00:16:04.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.749 "strip_size_kb": 64, 00:16:04.749 "state": "configuring", 00:16:04.749 "raid_level": "concat", 00:16:04.749 "superblock": false, 00:16:04.749 "num_base_bdevs": 4, 00:16:04.749 "num_base_bdevs_discovered": 1, 00:16:04.749 "num_base_bdevs_operational": 4, 00:16:04.749 "base_bdevs_list": [ 00:16:04.749 { 00:16:04.749 "name": "BaseBdev1", 00:16:04.749 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:04.749 "is_configured": true, 00:16:04.749 "data_offset": 0, 00:16:04.749 "data_size": 65536 00:16:04.749 }, 00:16:04.749 { 00:16:04.749 "name": "BaseBdev2", 00:16:04.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.749 "is_configured": false, 00:16:04.749 "data_offset": 0, 00:16:04.749 "data_size": 0 00:16:04.749 }, 00:16:04.749 { 00:16:04.749 "name": "BaseBdev3", 00:16:04.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.749 "is_configured": false, 00:16:04.749 "data_offset": 0, 00:16:04.749 "data_size": 0 00:16:04.749 }, 00:16:04.749 { 00:16:04.749 "name": "BaseBdev4", 00:16:04.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.750 "is_configured": false, 00:16:04.750 "data_offset": 0, 00:16:04.750 "data_size": 0 00:16:04.750 } 00:16:04.750 ] 00:16:04.750 }' 00:16:04.750 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.750 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.317 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:05.576 [2024-07-15 13:38:52.980356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:05.576 BaseBdev2 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.576 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.576 13:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:05.835 [ 00:16:05.835 { 00:16:05.835 "name": "BaseBdev2", 00:16:05.835 "aliases": [ 00:16:05.835 "6d90628a-9c78-424e-b7f4-737949225cb5" 00:16:05.835 ], 00:16:05.835 "product_name": "Malloc disk", 00:16:05.835 "block_size": 512, 00:16:05.835 "num_blocks": 65536, 00:16:05.835 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:05.835 "assigned_rate_limits": { 00:16:05.835 "rw_ios_per_sec": 0, 00:16:05.835 "rw_mbytes_per_sec": 0, 00:16:05.835 "r_mbytes_per_sec": 0, 00:16:05.835 "w_mbytes_per_sec": 0 00:16:05.835 }, 00:16:05.835 "claimed": true, 00:16:05.835 "claim_type": "exclusive_write", 00:16:05.835 "zoned": false, 00:16:05.835 "supported_io_types": { 00:16:05.835 "read": true, 00:16:05.835 "write": true, 00:16:05.835 "unmap": true, 00:16:05.835 "flush": true, 00:16:05.835 "reset": true, 00:16:05.835 "nvme_admin": false, 00:16:05.835 "nvme_io": false, 00:16:05.835 "nvme_io_md": false, 00:16:05.835 "write_zeroes": true, 00:16:05.835 "zcopy": true, 00:16:05.835 "get_zone_info": false, 00:16:05.835 "zone_management": false, 00:16:05.835 "zone_append": false, 00:16:05.835 "compare": false, 00:16:05.835 "compare_and_write": false, 00:16:05.835 "abort": true, 00:16:05.835 "seek_hole": false, 00:16:05.835 "seek_data": false, 00:16:05.835 "copy": true, 00:16:05.835 "nvme_iov_md": false 00:16:05.835 }, 00:16:05.835 "memory_domains": [ 00:16:05.835 { 00:16:05.835 "dma_device_id": "system", 00:16:05.835 "dma_device_type": 1 00:16:05.835 }, 00:16:05.835 { 00:16:05.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.835 "dma_device_type": 2 00:16:05.835 } 00:16:05.835 ], 00:16:05.835 "driver_specific": {} 00:16:05.835 } 00:16:05.835 ] 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.835 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.094 13:38:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.094 "name": "Existed_Raid", 00:16:06.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.094 "strip_size_kb": 64, 00:16:06.094 "state": "configuring", 00:16:06.094 "raid_level": "concat", 00:16:06.094 "superblock": false, 00:16:06.094 "num_base_bdevs": 4, 00:16:06.094 "num_base_bdevs_discovered": 2, 00:16:06.094 "num_base_bdevs_operational": 4, 00:16:06.094 "base_bdevs_list": [ 00:16:06.094 { 00:16:06.094 "name": "BaseBdev1", 00:16:06.094 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:06.094 "is_configured": true, 00:16:06.094 "data_offset": 0, 00:16:06.094 "data_size": 65536 00:16:06.094 }, 00:16:06.094 { 00:16:06.094 "name": "BaseBdev2", 00:16:06.094 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:06.094 "is_configured": true, 00:16:06.094 "data_offset": 0, 00:16:06.094 "data_size": 65536 00:16:06.094 }, 00:16:06.094 { 00:16:06.094 "name": "BaseBdev3", 00:16:06.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.094 "is_configured": false, 00:16:06.094 "data_offset": 0, 00:16:06.094 "data_size": 0 00:16:06.094 }, 00:16:06.094 { 00:16:06.094 "name": "BaseBdev4", 00:16:06.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.094 "is_configured": false, 00:16:06.094 "data_offset": 0, 00:16:06.094 "data_size": 0 00:16:06.094 } 00:16:06.094 ] 00:16:06.094 }' 00:16:06.094 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.094 13:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:06.661 [2024-07-15 13:38:54.179217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:06.661 BaseBdev3 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:06.661 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:06.918 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:07.178 [ 00:16:07.178 { 00:16:07.178 "name": "BaseBdev3", 00:16:07.178 "aliases": [ 00:16:07.178 "8b1a1826-b0bb-4bbe-9798-ad667e4f862f" 00:16:07.178 ], 00:16:07.178 "product_name": "Malloc disk", 00:16:07.178 "block_size": 512, 00:16:07.178 "num_blocks": 65536, 00:16:07.178 "uuid": "8b1a1826-b0bb-4bbe-9798-ad667e4f862f", 00:16:07.178 "assigned_rate_limits": { 00:16:07.178 "rw_ios_per_sec": 0, 00:16:07.178 "rw_mbytes_per_sec": 0, 00:16:07.178 "r_mbytes_per_sec": 0, 
00:16:07.178 "w_mbytes_per_sec": 0 00:16:07.178 }, 00:16:07.178 "claimed": true, 00:16:07.178 "claim_type": "exclusive_write", 00:16:07.178 "zoned": false, 00:16:07.178 "supported_io_types": { 00:16:07.178 "read": true, 00:16:07.178 "write": true, 00:16:07.178 "unmap": true, 00:16:07.178 "flush": true, 00:16:07.178 "reset": true, 00:16:07.178 "nvme_admin": false, 00:16:07.178 "nvme_io": false, 00:16:07.178 "nvme_io_md": false, 00:16:07.178 "write_zeroes": true, 00:16:07.178 "zcopy": true, 00:16:07.178 "get_zone_info": false, 00:16:07.178 "zone_management": false, 00:16:07.178 "zone_append": false, 00:16:07.178 "compare": false, 00:16:07.178 "compare_and_write": false, 00:16:07.178 "abort": true, 00:16:07.178 "seek_hole": false, 00:16:07.178 "seek_data": false, 00:16:07.178 "copy": true, 00:16:07.178 "nvme_iov_md": false 00:16:07.178 }, 00:16:07.178 "memory_domains": [ 00:16:07.178 { 00:16:07.178 "dma_device_id": "system", 00:16:07.178 "dma_device_type": 1 00:16:07.178 }, 00:16:07.178 { 00:16:07.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.178 "dma_device_type": 2 00:16:07.178 } 00:16:07.178 ], 00:16:07.178 "driver_specific": {} 00:16:07.178 } 00:16:07.178 ] 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.178 "name": "Existed_Raid", 00:16:07.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.178 "strip_size_kb": 64, 00:16:07.178 "state": "configuring", 00:16:07.178 "raid_level": "concat", 00:16:07.178 "superblock": false, 00:16:07.178 "num_base_bdevs": 4, 00:16:07.178 "num_base_bdevs_discovered": 3, 00:16:07.178 "num_base_bdevs_operational": 4, 00:16:07.178 "base_bdevs_list": [ 00:16:07.178 { 00:16:07.178 "name": "BaseBdev1", 
00:16:07.178 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:07.178 "is_configured": true, 00:16:07.178 "data_offset": 0, 00:16:07.178 "data_size": 65536 00:16:07.178 }, 00:16:07.178 { 00:16:07.178 "name": "BaseBdev2", 00:16:07.178 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:07.178 "is_configured": true, 00:16:07.178 "data_offset": 0, 00:16:07.178 "data_size": 65536 00:16:07.178 }, 00:16:07.178 { 00:16:07.178 "name": "BaseBdev3", 00:16:07.178 "uuid": "8b1a1826-b0bb-4bbe-9798-ad667e4f862f", 00:16:07.178 "is_configured": true, 00:16:07.178 "data_offset": 0, 00:16:07.178 "data_size": 65536 00:16:07.178 }, 00:16:07.178 { 00:16:07.178 "name": "BaseBdev4", 00:16:07.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.178 "is_configured": false, 00:16:07.178 "data_offset": 0, 00:16:07.178 "data_size": 0 00:16:07.178 } 00:16:07.178 ] 00:16:07.178 }' 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.178 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.742 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:08.000 [2024-07-15 13:38:55.397252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:08.000 [2024-07-15 13:38:55.397285] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cef840 00:16:08.000 [2024-07-15 13:38:55.397291] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:08.000 [2024-07-15 13:38:55.397441] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cef480 00:16:08.000 [2024-07-15 13:38:55.397528] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cef840 00:16:08.000 [2024-07-15 13:38:55.397534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cef840 00:16:08.000 [2024-07-15 13:38:55.397655] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:08.000 BaseBdev4 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.000 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:08.257 [ 00:16:08.257 { 00:16:08.257 "name": "BaseBdev4", 00:16:08.257 "aliases": [ 00:16:08.257 "46d79394-bba2-4b47-9160-a08a8bc04453" 00:16:08.257 ], 00:16:08.257 "product_name": "Malloc disk", 00:16:08.257 "block_size": 512, 00:16:08.257 
"num_blocks": 65536, 00:16:08.257 "uuid": "46d79394-bba2-4b47-9160-a08a8bc04453", 00:16:08.257 "assigned_rate_limits": { 00:16:08.257 "rw_ios_per_sec": 0, 00:16:08.257 "rw_mbytes_per_sec": 0, 00:16:08.257 "r_mbytes_per_sec": 0, 00:16:08.257 "w_mbytes_per_sec": 0 00:16:08.257 }, 00:16:08.257 "claimed": true, 00:16:08.257 "claim_type": "exclusive_write", 00:16:08.257 "zoned": false, 00:16:08.257 "supported_io_types": { 00:16:08.257 "read": true, 00:16:08.257 "write": true, 00:16:08.257 "unmap": true, 00:16:08.257 "flush": true, 00:16:08.257 "reset": true, 00:16:08.257 "nvme_admin": false, 00:16:08.257 "nvme_io": false, 00:16:08.257 "nvme_io_md": false, 00:16:08.257 "write_zeroes": true, 00:16:08.257 "zcopy": true, 00:16:08.257 "get_zone_info": false, 00:16:08.257 "zone_management": false, 00:16:08.257 "zone_append": false, 00:16:08.257 "compare": false, 00:16:08.257 "compare_and_write": false, 00:16:08.257 "abort": true, 00:16:08.257 "seek_hole": false, 00:16:08.257 "seek_data": false, 00:16:08.257 "copy": true, 00:16:08.257 "nvme_iov_md": false 00:16:08.257 }, 00:16:08.257 "memory_domains": [ 00:16:08.257 { 00:16:08.257 "dma_device_id": "system", 00:16:08.257 "dma_device_type": 1 00:16:08.257 }, 00:16:08.257 { 00:16:08.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.257 "dma_device_type": 2 00:16:08.257 } 00:16:08.257 ], 00:16:08.257 "driver_specific": {} 00:16:08.257 } 00:16:08.257 ] 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.257 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.517 "name": "Existed_Raid", 00:16:08.517 "uuid": "abe8ca4e-65ee-4eae-9537-fb6acad667e0", 00:16:08.517 "strip_size_kb": 64, 00:16:08.517 "state": "online", 00:16:08.517 "raid_level": "concat", 00:16:08.517 "superblock": false, 
00:16:08.517 "num_base_bdevs": 4, 00:16:08.517 "num_base_bdevs_discovered": 4, 00:16:08.517 "num_base_bdevs_operational": 4, 00:16:08.517 "base_bdevs_list": [ 00:16:08.517 { 00:16:08.517 "name": "BaseBdev1", 00:16:08.517 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:08.517 "is_configured": true, 00:16:08.517 "data_offset": 0, 00:16:08.517 "data_size": 65536 00:16:08.517 }, 00:16:08.517 { 00:16:08.517 "name": "BaseBdev2", 00:16:08.517 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:08.517 "is_configured": true, 00:16:08.517 "data_offset": 0, 00:16:08.517 "data_size": 65536 00:16:08.517 }, 00:16:08.517 { 00:16:08.517 "name": "BaseBdev3", 00:16:08.517 "uuid": "8b1a1826-b0bb-4bbe-9798-ad667e4f862f", 00:16:08.517 "is_configured": true, 00:16:08.517 "data_offset": 0, 00:16:08.517 "data_size": 65536 00:16:08.517 }, 00:16:08.517 { 00:16:08.517 "name": "BaseBdev4", 00:16:08.517 "uuid": "46d79394-bba2-4b47-9160-a08a8bc04453", 00:16:08.517 "is_configured": true, 00:16:08.517 "data_offset": 0, 00:16:08.517 "data_size": 65536 00:16:08.517 } 00:16:08.517 ] 00:16:08.517 }' 00:16:08.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.517 13:38:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:09.083 [2024-07-15 13:38:56.624654] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:09.083 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:09.083 "name": "Existed_Raid", 00:16:09.083 "aliases": [ 00:16:09.083 "abe8ca4e-65ee-4eae-9537-fb6acad667e0" 00:16:09.083 ], 00:16:09.083 "product_name": "Raid Volume", 00:16:09.083 "block_size": 512, 00:16:09.083 "num_blocks": 262144, 00:16:09.083 "uuid": "abe8ca4e-65ee-4eae-9537-fb6acad667e0", 00:16:09.083 "assigned_rate_limits": { 00:16:09.083 "rw_ios_per_sec": 0, 00:16:09.083 "rw_mbytes_per_sec": 0, 00:16:09.083 "r_mbytes_per_sec": 0, 00:16:09.083 "w_mbytes_per_sec": 0 00:16:09.083 }, 00:16:09.083 "claimed": false, 00:16:09.083 "zoned": false, 00:16:09.083 "supported_io_types": { 00:16:09.083 "read": true, 00:16:09.083 "write": true, 00:16:09.083 "unmap": true, 00:16:09.083 "flush": true, 00:16:09.083 "reset": true, 00:16:09.083 "nvme_admin": false, 00:16:09.083 "nvme_io": false, 00:16:09.083 "nvme_io_md": false, 00:16:09.083 "write_zeroes": true, 00:16:09.083 "zcopy": false, 00:16:09.083 "get_zone_info": false, 00:16:09.083 "zone_management": false, 00:16:09.083 "zone_append": false, 00:16:09.083 "compare": false, 00:16:09.083 
"compare_and_write": false, 00:16:09.083 "abort": false, 00:16:09.083 "seek_hole": false, 00:16:09.083 "seek_data": false, 00:16:09.083 "copy": false, 00:16:09.083 "nvme_iov_md": false 00:16:09.083 }, 00:16:09.083 "memory_domains": [ 00:16:09.083 { 00:16:09.083 "dma_device_id": "system", 00:16:09.083 "dma_device_type": 1 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.083 "dma_device_type": 2 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "system", 00:16:09.083 "dma_device_type": 1 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.083 "dma_device_type": 2 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "system", 00:16:09.083 "dma_device_type": 1 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.083 "dma_device_type": 2 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "system", 00:16:09.083 "dma_device_type": 1 00:16:09.083 }, 00:16:09.083 { 00:16:09.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.083 "dma_device_type": 2 00:16:09.083 } 00:16:09.083 ], 00:16:09.083 "driver_specific": { 00:16:09.083 "raid": { 00:16:09.083 "uuid": "abe8ca4e-65ee-4eae-9537-fb6acad667e0", 00:16:09.083 "strip_size_kb": 64, 00:16:09.083 "state": "online", 00:16:09.083 "raid_level": "concat", 00:16:09.083 "superblock": false, 00:16:09.083 "num_base_bdevs": 4, 00:16:09.083 "num_base_bdevs_discovered": 4, 00:16:09.083 "num_base_bdevs_operational": 4, 00:16:09.083 "base_bdevs_list": [ 00:16:09.083 { 00:16:09.083 "name": "BaseBdev1", 00:16:09.083 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:09.083 "is_configured": true, 00:16:09.083 "data_offset": 0, 00:16:09.084 "data_size": 65536 00:16:09.084 }, 00:16:09.084 { 00:16:09.084 "name": "BaseBdev2", 00:16:09.084 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:09.084 "is_configured": true, 00:16:09.084 "data_offset": 0, 00:16:09.084 "data_size": 65536 00:16:09.084 }, 00:16:09.084 { 00:16:09.084 "name": "BaseBdev3", 00:16:09.084 "uuid": "8b1a1826-b0bb-4bbe-9798-ad667e4f862f", 00:16:09.084 "is_configured": true, 00:16:09.084 "data_offset": 0, 00:16:09.084 "data_size": 65536 00:16:09.084 }, 00:16:09.084 { 00:16:09.084 "name": "BaseBdev4", 00:16:09.084 "uuid": "46d79394-bba2-4b47-9160-a08a8bc04453", 00:16:09.084 "is_configured": true, 00:16:09.084 "data_offset": 0, 00:16:09.084 "data_size": 65536 00:16:09.084 } 00:16:09.084 ] 00:16:09.084 } 00:16:09.084 } 00:16:09.084 }' 00:16:09.084 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:09.084 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:09.084 BaseBdev2 00:16:09.084 BaseBdev3 00:16:09.084 BaseBdev4' 00:16:09.084 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:09.084 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:09.084 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:09.341 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:09.342 "name": "BaseBdev1", 00:16:09.342 "aliases": [ 00:16:09.342 "7802b948-486a-4fba-bf73-06b13bf27db9" 00:16:09.342 ], 00:16:09.342 
"product_name": "Malloc disk", 00:16:09.342 "block_size": 512, 00:16:09.342 "num_blocks": 65536, 00:16:09.342 "uuid": "7802b948-486a-4fba-bf73-06b13bf27db9", 00:16:09.342 "assigned_rate_limits": { 00:16:09.342 "rw_ios_per_sec": 0, 00:16:09.342 "rw_mbytes_per_sec": 0, 00:16:09.342 "r_mbytes_per_sec": 0, 00:16:09.342 "w_mbytes_per_sec": 0 00:16:09.342 }, 00:16:09.342 "claimed": true, 00:16:09.342 "claim_type": "exclusive_write", 00:16:09.342 "zoned": false, 00:16:09.342 "supported_io_types": { 00:16:09.342 "read": true, 00:16:09.342 "write": true, 00:16:09.342 "unmap": true, 00:16:09.342 "flush": true, 00:16:09.342 "reset": true, 00:16:09.342 "nvme_admin": false, 00:16:09.342 "nvme_io": false, 00:16:09.342 "nvme_io_md": false, 00:16:09.342 "write_zeroes": true, 00:16:09.342 "zcopy": true, 00:16:09.342 "get_zone_info": false, 00:16:09.342 "zone_management": false, 00:16:09.342 "zone_append": false, 00:16:09.342 "compare": false, 00:16:09.342 "compare_and_write": false, 00:16:09.342 "abort": true, 00:16:09.342 "seek_hole": false, 00:16:09.342 "seek_data": false, 00:16:09.342 "copy": true, 00:16:09.342 "nvme_iov_md": false 00:16:09.342 }, 00:16:09.342 "memory_domains": [ 00:16:09.342 { 00:16:09.342 "dma_device_id": "system", 00:16:09.342 "dma_device_type": 1 00:16:09.342 }, 00:16:09.342 { 00:16:09.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.342 "dma_device_type": 2 00:16:09.342 } 00:16:09.342 ], 00:16:09.342 "driver_specific": {} 00:16:09.342 }' 00:16:09.342 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.342 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.342 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:09.342 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.600 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:09.600 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:09.858 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:09.858 "name": "BaseBdev2", 00:16:09.858 "aliases": [ 00:16:09.858 "6d90628a-9c78-424e-b7f4-737949225cb5" 00:16:09.858 ], 00:16:09.858 "product_name": "Malloc disk", 00:16:09.858 "block_size": 512, 00:16:09.858 "num_blocks": 65536, 00:16:09.858 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:09.858 
"assigned_rate_limits": { 00:16:09.858 "rw_ios_per_sec": 0, 00:16:09.858 "rw_mbytes_per_sec": 0, 00:16:09.858 "r_mbytes_per_sec": 0, 00:16:09.858 "w_mbytes_per_sec": 0 00:16:09.858 }, 00:16:09.858 "claimed": true, 00:16:09.858 "claim_type": "exclusive_write", 00:16:09.858 "zoned": false, 00:16:09.858 "supported_io_types": { 00:16:09.858 "read": true, 00:16:09.858 "write": true, 00:16:09.858 "unmap": true, 00:16:09.858 "flush": true, 00:16:09.858 "reset": true, 00:16:09.858 "nvme_admin": false, 00:16:09.858 "nvme_io": false, 00:16:09.858 "nvme_io_md": false, 00:16:09.858 "write_zeroes": true, 00:16:09.859 "zcopy": true, 00:16:09.859 "get_zone_info": false, 00:16:09.859 "zone_management": false, 00:16:09.859 "zone_append": false, 00:16:09.859 "compare": false, 00:16:09.859 "compare_and_write": false, 00:16:09.859 "abort": true, 00:16:09.859 "seek_hole": false, 00:16:09.859 "seek_data": false, 00:16:09.859 "copy": true, 00:16:09.859 "nvme_iov_md": false 00:16:09.859 }, 00:16:09.859 "memory_domains": [ 00:16:09.859 { 00:16:09.859 "dma_device_id": "system", 00:16:09.859 "dma_device_type": 1 00:16:09.859 }, 00:16:09.859 { 00:16:09.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.859 "dma_device_type": 2 00:16:09.859 } 00:16:09.859 ], 00:16:09.859 "driver_specific": {} 00:16:09.859 }' 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:09.859 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:10.116 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.375 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.375 "name": "BaseBdev3", 00:16:10.375 "aliases": [ 00:16:10.375 "8b1a1826-b0bb-4bbe-9798-ad667e4f862f" 00:16:10.375 ], 00:16:10.375 "product_name": "Malloc disk", 00:16:10.375 "block_size": 512, 00:16:10.375 "num_blocks": 65536, 00:16:10.375 "uuid": "8b1a1826-b0bb-4bbe-9798-ad667e4f862f", 00:16:10.375 "assigned_rate_limits": { 00:16:10.375 "rw_ios_per_sec": 0, 00:16:10.375 "rw_mbytes_per_sec": 0, 00:16:10.375 "r_mbytes_per_sec": 0, 00:16:10.375 "w_mbytes_per_sec": 0 00:16:10.375 
}, 00:16:10.375 "claimed": true, 00:16:10.375 "claim_type": "exclusive_write", 00:16:10.375 "zoned": false, 00:16:10.375 "supported_io_types": { 00:16:10.375 "read": true, 00:16:10.375 "write": true, 00:16:10.375 "unmap": true, 00:16:10.376 "flush": true, 00:16:10.376 "reset": true, 00:16:10.376 "nvme_admin": false, 00:16:10.376 "nvme_io": false, 00:16:10.376 "nvme_io_md": false, 00:16:10.376 "write_zeroes": true, 00:16:10.376 "zcopy": true, 00:16:10.376 "get_zone_info": false, 00:16:10.376 "zone_management": false, 00:16:10.376 "zone_append": false, 00:16:10.376 "compare": false, 00:16:10.376 "compare_and_write": false, 00:16:10.376 "abort": true, 00:16:10.376 "seek_hole": false, 00:16:10.376 "seek_data": false, 00:16:10.376 "copy": true, 00:16:10.376 "nvme_iov_md": false 00:16:10.376 }, 00:16:10.376 "memory_domains": [ 00:16:10.376 { 00:16:10.376 "dma_device_id": "system", 00:16:10.376 "dma_device_type": 1 00:16:10.376 }, 00:16:10.376 { 00:16:10.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.376 "dma_device_type": 2 00:16:10.376 } 00:16:10.376 ], 00:16:10.376 "driver_specific": {} 00:16:10.376 }' 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.376 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.634 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.634 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.634 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.634 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.634 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.634 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:10.634 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.634 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.634 "name": "BaseBdev4", 00:16:10.634 "aliases": [ 00:16:10.634 "46d79394-bba2-4b47-9160-a08a8bc04453" 00:16:10.634 ], 00:16:10.634 "product_name": "Malloc disk", 00:16:10.634 "block_size": 512, 00:16:10.635 "num_blocks": 65536, 00:16:10.635 "uuid": "46d79394-bba2-4b47-9160-a08a8bc04453", 00:16:10.635 "assigned_rate_limits": { 00:16:10.635 "rw_ios_per_sec": 0, 00:16:10.635 "rw_mbytes_per_sec": 0, 00:16:10.635 "r_mbytes_per_sec": 0, 00:16:10.635 "w_mbytes_per_sec": 0 00:16:10.635 }, 00:16:10.635 "claimed": true, 00:16:10.635 "claim_type": "exclusive_write", 00:16:10.635 "zoned": false, 00:16:10.635 "supported_io_types": { 00:16:10.635 "read": true, 
00:16:10.635 "write": true, 00:16:10.635 "unmap": true, 00:16:10.635 "flush": true, 00:16:10.635 "reset": true, 00:16:10.635 "nvme_admin": false, 00:16:10.635 "nvme_io": false, 00:16:10.635 "nvme_io_md": false, 00:16:10.635 "write_zeroes": true, 00:16:10.635 "zcopy": true, 00:16:10.635 "get_zone_info": false, 00:16:10.635 "zone_management": false, 00:16:10.635 "zone_append": false, 00:16:10.635 "compare": false, 00:16:10.635 "compare_and_write": false, 00:16:10.635 "abort": true, 00:16:10.635 "seek_hole": false, 00:16:10.635 "seek_data": false, 00:16:10.635 "copy": true, 00:16:10.635 "nvme_iov_md": false 00:16:10.635 }, 00:16:10.635 "memory_domains": [ 00:16:10.635 { 00:16:10.635 "dma_device_id": "system", 00:16:10.635 "dma_device_type": 1 00:16:10.635 }, 00:16:10.635 { 00:16:10.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.635 "dma_device_type": 2 00:16:10.635 } 00:16:10.635 ], 00:16:10.635 "driver_specific": {} 00:16:10.635 }' 00:16:10.635 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.893 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:11.152 [2024-07-15 13:38:58.633672] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:11.152 [2024-07-15 13:38:58.633697] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:11.152 [2024-07-15 13:38:58.633738] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.152 13:38:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.152 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.411 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.411 "name": "Existed_Raid", 00:16:11.411 "uuid": "abe8ca4e-65ee-4eae-9537-fb6acad667e0", 00:16:11.411 "strip_size_kb": 64, 00:16:11.411 "state": "offline", 00:16:11.411 "raid_level": "concat", 00:16:11.411 "superblock": false, 00:16:11.411 "num_base_bdevs": 4, 00:16:11.411 "num_base_bdevs_discovered": 3, 00:16:11.411 "num_base_bdevs_operational": 3, 00:16:11.411 "base_bdevs_list": [ 00:16:11.411 { 00:16:11.411 "name": null, 00:16:11.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.411 "is_configured": false, 00:16:11.411 "data_offset": 0, 00:16:11.411 "data_size": 65536 00:16:11.411 }, 00:16:11.411 { 00:16:11.411 "name": "BaseBdev2", 00:16:11.411 "uuid": "6d90628a-9c78-424e-b7f4-737949225cb5", 00:16:11.411 "is_configured": true, 00:16:11.411 "data_offset": 0, 00:16:11.411 "data_size": 65536 00:16:11.411 }, 00:16:11.411 { 00:16:11.411 "name": "BaseBdev3", 00:16:11.411 "uuid": "8b1a1826-b0bb-4bbe-9798-ad667e4f862f", 00:16:11.411 "is_configured": true, 00:16:11.411 "data_offset": 0, 00:16:11.411 "data_size": 65536 00:16:11.411 }, 00:16:11.411 { 00:16:11.411 "name": "BaseBdev4", 00:16:11.411 "uuid": "46d79394-bba2-4b47-9160-a08a8bc04453", 00:16:11.411 "is_configured": true, 00:16:11.411 "data_offset": 0, 00:16:11.411 "data_size": 65536 00:16:11.411 } 00:16:11.411 ] 00:16:11.411 }' 00:16:11.411 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.411 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:16:11.977 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:12.235 [2024-07-15 13:38:59.649046] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:12.235 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:12.494 [2024-07-15 13:38:59.999949] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:12.494 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:12.494 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:12.494 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.494 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:12.751 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:12.751 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:12.751 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:12.751 [2024-07-15 13:39:00.358951] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:12.751 [2024-07-15 13:39:00.358991] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cef840 name Existed_Raid, state offline 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:13.009 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:13.268 BaseBdev2 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:13.268 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:13.524 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:13.524 [ 00:16:13.524 { 00:16:13.524 "name": "BaseBdev2", 00:16:13.524 "aliases": [ 00:16:13.524 "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5" 00:16:13.524 ], 00:16:13.524 "product_name": "Malloc disk", 00:16:13.524 "block_size": 512, 00:16:13.524 "num_blocks": 65536, 00:16:13.524 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:13.524 "assigned_rate_limits": { 00:16:13.524 "rw_ios_per_sec": 0, 00:16:13.524 "rw_mbytes_per_sec": 0, 00:16:13.524 "r_mbytes_per_sec": 0, 00:16:13.524 "w_mbytes_per_sec": 0 00:16:13.524 }, 00:16:13.524 "claimed": false, 00:16:13.524 "zoned": false, 00:16:13.524 "supported_io_types": { 00:16:13.524 "read": true, 00:16:13.524 "write": true, 00:16:13.524 "unmap": true, 00:16:13.524 "flush": true, 00:16:13.525 "reset": true, 00:16:13.525 "nvme_admin": false, 00:16:13.525 "nvme_io": false, 00:16:13.525 "nvme_io_md": false, 00:16:13.525 "write_zeroes": true, 00:16:13.525 "zcopy": true, 00:16:13.525 "get_zone_info": false, 00:16:13.525 "zone_management": false, 00:16:13.525 "zone_append": false, 00:16:13.525 "compare": false, 00:16:13.525 "compare_and_write": false, 00:16:13.525 "abort": true, 00:16:13.525 "seek_hole": false, 00:16:13.525 "seek_data": false, 00:16:13.525 "copy": true, 00:16:13.525 "nvme_iov_md": false 00:16:13.525 }, 00:16:13.525 "memory_domains": [ 00:16:13.525 { 00:16:13.525 "dma_device_id": "system", 00:16:13.525 "dma_device_type": 1 00:16:13.525 }, 00:16:13.525 { 00:16:13.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.525 "dma_device_type": 2 00:16:13.525 } 00:16:13.525 ], 00:16:13.525 "driver_specific": {} 00:16:13.525 } 00:16:13.525 ] 00:16:13.525 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:13.525 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:13.525 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:13.525 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:13.781 BaseBdev3 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:13.781 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:14.039 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:14.039 [ 00:16:14.039 { 00:16:14.039 "name": "BaseBdev3", 00:16:14.039 "aliases": [ 00:16:14.039 "2cbaadc2-14b3-498f-be3c-9d613ae375c5" 00:16:14.039 ], 00:16:14.039 "product_name": "Malloc disk", 00:16:14.039 "block_size": 512, 00:16:14.039 "num_blocks": 65536, 00:16:14.039 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:14.039 "assigned_rate_limits": { 00:16:14.039 "rw_ios_per_sec": 0, 00:16:14.039 "rw_mbytes_per_sec": 0, 00:16:14.039 "r_mbytes_per_sec": 0, 00:16:14.039 "w_mbytes_per_sec": 0 00:16:14.039 }, 00:16:14.039 "claimed": false, 00:16:14.039 "zoned": false, 00:16:14.039 "supported_io_types": { 00:16:14.039 "read": true, 00:16:14.039 "write": true, 00:16:14.039 "unmap": true, 00:16:14.039 "flush": true, 00:16:14.039 "reset": true, 00:16:14.040 "nvme_admin": false, 00:16:14.040 "nvme_io": false, 00:16:14.040 "nvme_io_md": false, 00:16:14.040 "write_zeroes": true, 00:16:14.040 "zcopy": true, 00:16:14.040 "get_zone_info": false, 00:16:14.040 "zone_management": false, 00:16:14.040 "zone_append": false, 00:16:14.040 "compare": false, 00:16:14.040 "compare_and_write": false, 00:16:14.040 "abort": true, 00:16:14.040 "seek_hole": false, 00:16:14.040 "seek_data": false, 00:16:14.040 "copy": true, 00:16:14.040 "nvme_iov_md": false 00:16:14.040 }, 00:16:14.040 "memory_domains": [ 00:16:14.040 { 00:16:14.040 "dma_device_id": "system", 00:16:14.040 "dma_device_type": 1 00:16:14.040 }, 00:16:14.040 { 00:16:14.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.040 "dma_device_type": 2 00:16:14.040 } 00:16:14.040 ], 00:16:14.040 "driver_specific": {} 00:16:14.040 } 00:16:14.040 ] 00:16:14.040 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:14.040 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:14.040 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:14.040 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:14.298 BaseBdev4 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:14.298 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:14.554 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:14.554 [ 00:16:14.554 { 00:16:14.554 "name": "BaseBdev4", 00:16:14.554 "aliases": [ 00:16:14.554 "b700ed8a-bfc9-46ed-ad79-15514f96f4ff" 00:16:14.554 ], 00:16:14.554 "product_name": "Malloc disk", 00:16:14.554 "block_size": 512, 00:16:14.554 "num_blocks": 65536, 00:16:14.554 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:14.554 "assigned_rate_limits": { 00:16:14.554 "rw_ios_per_sec": 0, 00:16:14.554 "rw_mbytes_per_sec": 0, 00:16:14.554 "r_mbytes_per_sec": 0, 00:16:14.554 "w_mbytes_per_sec": 0 00:16:14.554 }, 00:16:14.554 "claimed": false, 00:16:14.554 "zoned": false, 00:16:14.554 "supported_io_types": { 00:16:14.554 "read": true, 00:16:14.554 "write": true, 00:16:14.554 "unmap": true, 00:16:14.554 "flush": true, 00:16:14.554 "reset": true, 00:16:14.554 "nvme_admin": false, 00:16:14.554 "nvme_io": false, 00:16:14.554 "nvme_io_md": false, 00:16:14.554 "write_zeroes": true, 00:16:14.554 "zcopy": true, 00:16:14.554 "get_zone_info": false, 00:16:14.554 "zone_management": false, 00:16:14.554 "zone_append": false, 00:16:14.554 "compare": false, 00:16:14.554 "compare_and_write": false, 00:16:14.554 "abort": true, 00:16:14.554 "seek_hole": false, 00:16:14.554 "seek_data": false, 00:16:14.554 "copy": true, 00:16:14.554 "nvme_iov_md": false 00:16:14.554 }, 00:16:14.554 "memory_domains": [ 00:16:14.554 { 00:16:14.554 "dma_device_id": "system", 00:16:14.554 "dma_device_type": 1 00:16:14.554 }, 00:16:14.554 { 00:16:14.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.554 "dma_device_type": 2 00:16:14.554 } 00:16:14.554 ], 00:16:14.554 "driver_specific": {} 00:16:14.554 } 00:16:14.554 ] 00:16:14.554 13:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:14.554 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:14.554 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:14.554 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:14.811 [2024-07-15 13:39:02.332267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:14.811 [2024-07-15 13:39:02.332305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:14.811 [2024-07-15 13:39:02.332320] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:14.811 [2024-07-15 13:39:02.333354] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:14.811 [2024-07-15 13:39:02.333387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.811 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.069 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.069 "name": "Existed_Raid", 00:16:15.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.069 "strip_size_kb": 64, 00:16:15.069 "state": "configuring", 00:16:15.069 "raid_level": "concat", 00:16:15.069 "superblock": false, 00:16:15.069 "num_base_bdevs": 4, 00:16:15.069 "num_base_bdevs_discovered": 3, 00:16:15.069 "num_base_bdevs_operational": 4, 00:16:15.069 "base_bdevs_list": [ 00:16:15.069 { 00:16:15.069 "name": "BaseBdev1", 00:16:15.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.069 "is_configured": false, 00:16:15.069 "data_offset": 0, 00:16:15.069 "data_size": 0 00:16:15.069 }, 00:16:15.069 { 00:16:15.069 "name": "BaseBdev2", 00:16:15.069 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:15.069 "is_configured": true, 00:16:15.069 "data_offset": 0, 00:16:15.069 "data_size": 65536 00:16:15.069 }, 00:16:15.069 { 00:16:15.069 "name": "BaseBdev3", 00:16:15.069 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:15.069 "is_configured": true, 00:16:15.069 "data_offset": 0, 00:16:15.069 "data_size": 65536 00:16:15.069 }, 00:16:15.069 { 00:16:15.069 "name": "BaseBdev4", 00:16:15.069 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:15.069 "is_configured": true, 00:16:15.069 "data_offset": 0, 00:16:15.069 "data_size": 65536 00:16:15.069 } 00:16:15.069 ] 00:16:15.069 }' 00:16:15.069 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.069 13:39:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.633 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
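[editor sketch] The removal just issued and the state check that follows it can be replayed by hand against the same RPC socket. This is only a sketch assembled from the rpc.py and jq invocations already visible in this trace; the script path, socket path, bdev names, and expected values are taken from this log and would differ on another run:
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # drop one base bdev from the still-configuring array (as bdev_raid.sh@308 does above)
  $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev2
  # the raid bdev should stay in the "configuring" state with one fewer discovered member
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'                  # expect: configuring
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid") | .num_base_bdevs_discovered' # expect: 2
  # the emptied slot is reported as an unconfigured entry in base_bdevs_list
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'                              # expect: false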
00:16:15.633 [2024-07-15 13:39:03.194479] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:15.633 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.634 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.890 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.890 "name": "Existed_Raid", 00:16:15.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.890 "strip_size_kb": 64, 00:16:15.890 "state": "configuring", 00:16:15.890 "raid_level": "concat", 00:16:15.890 "superblock": false, 00:16:15.890 "num_base_bdevs": 4, 00:16:15.890 "num_base_bdevs_discovered": 2, 00:16:15.890 "num_base_bdevs_operational": 4, 00:16:15.890 "base_bdevs_list": [ 00:16:15.890 { 00:16:15.890 "name": "BaseBdev1", 00:16:15.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.890 "is_configured": false, 00:16:15.890 "data_offset": 0, 00:16:15.890 "data_size": 0 00:16:15.890 }, 00:16:15.890 { 00:16:15.890 "name": null, 00:16:15.890 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:15.890 "is_configured": false, 00:16:15.890 "data_offset": 0, 00:16:15.890 "data_size": 65536 00:16:15.890 }, 00:16:15.890 { 00:16:15.890 "name": "BaseBdev3", 00:16:15.890 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:15.890 "is_configured": true, 00:16:15.890 "data_offset": 0, 00:16:15.890 "data_size": 65536 00:16:15.890 }, 00:16:15.890 { 00:16:15.890 "name": "BaseBdev4", 00:16:15.890 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:15.890 "is_configured": true, 00:16:15.890 "data_offset": 0, 00:16:15.890 "data_size": 65536 00:16:15.890 } 00:16:15.890 ] 00:16:15.890 }' 00:16:15.890 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.890 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.455 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.455 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:16.751 [2024-07-15 13:39:04.265427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:16.751 BaseBdev1 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:16.751 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:16.752 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:16.752 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:17.059 [ 00:16:17.059 { 00:16:17.059 "name": "BaseBdev1", 00:16:17.059 "aliases": [ 00:16:17.059 "d4544725-42a4-4d65-8d56-697f768c5b4d" 00:16:17.059 ], 00:16:17.059 "product_name": "Malloc disk", 00:16:17.059 "block_size": 512, 00:16:17.059 "num_blocks": 65536, 00:16:17.059 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:17.059 "assigned_rate_limits": { 00:16:17.059 "rw_ios_per_sec": 0, 00:16:17.059 "rw_mbytes_per_sec": 0, 00:16:17.059 "r_mbytes_per_sec": 0, 00:16:17.059 "w_mbytes_per_sec": 0 00:16:17.059 }, 00:16:17.059 "claimed": true, 00:16:17.059 "claim_type": "exclusive_write", 00:16:17.059 "zoned": false, 00:16:17.059 "supported_io_types": { 00:16:17.059 "read": true, 00:16:17.059 "write": true, 00:16:17.059 "unmap": true, 00:16:17.059 "flush": true, 00:16:17.059 "reset": true, 00:16:17.059 "nvme_admin": false, 00:16:17.059 "nvme_io": false, 00:16:17.059 "nvme_io_md": false, 00:16:17.059 "write_zeroes": true, 00:16:17.059 "zcopy": true, 00:16:17.059 "get_zone_info": false, 00:16:17.059 "zone_management": false, 00:16:17.059 "zone_append": false, 00:16:17.059 "compare": false, 00:16:17.059 "compare_and_write": false, 00:16:17.059 "abort": true, 00:16:17.059 "seek_hole": false, 00:16:17.059 "seek_data": false, 00:16:17.059 "copy": true, 00:16:17.059 "nvme_iov_md": false 00:16:17.059 }, 00:16:17.059 "memory_domains": [ 00:16:17.059 { 00:16:17.059 "dma_device_id": "system", 00:16:17.059 "dma_device_type": 1 00:16:17.059 }, 00:16:17.059 { 00:16:17.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.059 "dma_device_type": 2 00:16:17.059 } 00:16:17.059 ], 00:16:17.059 "driver_specific": {} 00:16:17.059 } 00:16:17.059 ] 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:17.059 13:39:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.059 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.318 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.318 "name": "Existed_Raid", 00:16:17.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.318 "strip_size_kb": 64, 00:16:17.318 "state": "configuring", 00:16:17.318 "raid_level": "concat", 00:16:17.318 "superblock": false, 00:16:17.318 "num_base_bdevs": 4, 00:16:17.318 "num_base_bdevs_discovered": 3, 00:16:17.318 "num_base_bdevs_operational": 4, 00:16:17.318 "base_bdevs_list": [ 00:16:17.318 { 00:16:17.318 "name": "BaseBdev1", 00:16:17.318 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:17.318 "is_configured": true, 00:16:17.318 "data_offset": 0, 00:16:17.318 "data_size": 65536 00:16:17.318 }, 00:16:17.318 { 00:16:17.318 "name": null, 00:16:17.318 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:17.318 "is_configured": false, 00:16:17.318 "data_offset": 0, 00:16:17.318 "data_size": 65536 00:16:17.318 }, 00:16:17.318 { 00:16:17.318 "name": "BaseBdev3", 00:16:17.318 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:17.318 "is_configured": true, 00:16:17.318 "data_offset": 0, 00:16:17.318 "data_size": 65536 00:16:17.318 }, 00:16:17.318 { 00:16:17.318 "name": "BaseBdev4", 00:16:17.318 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:17.318 "is_configured": true, 00:16:17.318 "data_offset": 0, 00:16:17.318 "data_size": 65536 00:16:17.318 } 00:16:17.318 ] 00:16:17.318 }' 00:16:17.318 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.318 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.883 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.883 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:18.141 [2024-07-15 13:39:05.665085] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.141 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.399 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.399 "name": "Existed_Raid", 00:16:18.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.399 "strip_size_kb": 64, 00:16:18.399 "state": "configuring", 00:16:18.399 "raid_level": "concat", 00:16:18.399 "superblock": false, 00:16:18.399 "num_base_bdevs": 4, 00:16:18.399 "num_base_bdevs_discovered": 2, 00:16:18.399 "num_base_bdevs_operational": 4, 00:16:18.399 "base_bdevs_list": [ 00:16:18.399 { 00:16:18.399 "name": "BaseBdev1", 00:16:18.399 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:18.399 "is_configured": true, 00:16:18.399 "data_offset": 0, 00:16:18.399 "data_size": 65536 00:16:18.399 }, 00:16:18.399 { 00:16:18.399 "name": null, 00:16:18.399 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:18.399 "is_configured": false, 00:16:18.399 "data_offset": 0, 00:16:18.399 "data_size": 65536 00:16:18.399 }, 00:16:18.399 { 00:16:18.399 "name": null, 00:16:18.399 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:18.399 "is_configured": false, 00:16:18.399 "data_offset": 0, 00:16:18.399 "data_size": 65536 00:16:18.399 }, 00:16:18.399 { 00:16:18.399 "name": "BaseBdev4", 00:16:18.399 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:18.399 "is_configured": true, 00:16:18.399 "data_offset": 0, 00:16:18.399 "data_size": 65536 00:16:18.399 } 00:16:18.399 ] 00:16:18.399 }' 00:16:18.399 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.399 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.965 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.965 13:39:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:18.965 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:18.965 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:19.223 [2024-07-15 13:39:06.703765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.223 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.482 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.482 "name": "Existed_Raid", 00:16:19.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.482 "strip_size_kb": 64, 00:16:19.482 "state": "configuring", 00:16:19.482 "raid_level": "concat", 00:16:19.482 "superblock": false, 00:16:19.482 "num_base_bdevs": 4, 00:16:19.482 "num_base_bdevs_discovered": 3, 00:16:19.482 "num_base_bdevs_operational": 4, 00:16:19.482 "base_bdevs_list": [ 00:16:19.482 { 00:16:19.482 "name": "BaseBdev1", 00:16:19.482 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:19.482 "is_configured": true, 00:16:19.482 "data_offset": 0, 00:16:19.482 "data_size": 65536 00:16:19.482 }, 00:16:19.482 { 00:16:19.482 "name": null, 00:16:19.482 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:19.482 "is_configured": false, 00:16:19.482 "data_offset": 0, 00:16:19.482 "data_size": 65536 00:16:19.482 }, 00:16:19.482 { 00:16:19.482 "name": "BaseBdev3", 00:16:19.482 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:19.482 "is_configured": true, 00:16:19.482 "data_offset": 0, 00:16:19.482 "data_size": 65536 00:16:19.482 }, 00:16:19.482 { 00:16:19.482 "name": "BaseBdev4", 00:16:19.482 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:19.482 "is_configured": true, 00:16:19.482 "data_offset": 0, 00:16:19.482 "data_size": 65536 00:16:19.482 } 00:16:19.482 ] 00:16:19.482 }' 00:16:19.482 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:16:19.482 13:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.049 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.049 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:20.049 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:20.049 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:20.307 [2024-07-15 13:39:07.706418] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.307 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.307 "name": "Existed_Raid", 00:16:20.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.307 "strip_size_kb": 64, 00:16:20.307 "state": "configuring", 00:16:20.307 "raid_level": "concat", 00:16:20.307 "superblock": false, 00:16:20.307 "num_base_bdevs": 4, 00:16:20.307 "num_base_bdevs_discovered": 2, 00:16:20.307 "num_base_bdevs_operational": 4, 00:16:20.307 "base_bdevs_list": [ 00:16:20.307 { 00:16:20.307 "name": null, 00:16:20.307 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:20.307 "is_configured": false, 00:16:20.307 "data_offset": 0, 00:16:20.307 "data_size": 65536 00:16:20.307 }, 00:16:20.307 { 00:16:20.307 "name": null, 00:16:20.307 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:20.307 "is_configured": false, 00:16:20.307 "data_offset": 0, 00:16:20.307 "data_size": 65536 00:16:20.307 }, 00:16:20.307 { 00:16:20.307 "name": "BaseBdev3", 00:16:20.307 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:20.307 "is_configured": true, 00:16:20.307 "data_offset": 0, 00:16:20.307 "data_size": 65536 00:16:20.307 }, 00:16:20.307 { 
00:16:20.307 "name": "BaseBdev4", 00:16:20.307 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:20.307 "is_configured": true, 00:16:20.307 "data_offset": 0, 00:16:20.307 "data_size": 65536 00:16:20.307 } 00:16:20.307 ] 00:16:20.307 }' 00:16:20.308 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.308 13:39:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.875 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:20.875 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.132 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:21.132 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:21.132 [2024-07-15 13:39:08.740979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.390 "name": "Existed_Raid", 00:16:21.390 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.390 "strip_size_kb": 64, 00:16:21.390 "state": "configuring", 00:16:21.390 "raid_level": "concat", 00:16:21.390 "superblock": false, 00:16:21.390 "num_base_bdevs": 4, 00:16:21.390 "num_base_bdevs_discovered": 3, 00:16:21.390 "num_base_bdevs_operational": 4, 00:16:21.390 "base_bdevs_list": [ 00:16:21.390 { 00:16:21.390 "name": null, 00:16:21.390 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:21.390 "is_configured": false, 00:16:21.390 "data_offset": 0, 00:16:21.390 "data_size": 65536 00:16:21.390 }, 00:16:21.390 { 00:16:21.390 "name": "BaseBdev2", 00:16:21.390 "uuid": 
"d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:21.390 "is_configured": true, 00:16:21.390 "data_offset": 0, 00:16:21.390 "data_size": 65536 00:16:21.390 }, 00:16:21.390 { 00:16:21.390 "name": "BaseBdev3", 00:16:21.390 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:21.390 "is_configured": true, 00:16:21.390 "data_offset": 0, 00:16:21.390 "data_size": 65536 00:16:21.390 }, 00:16:21.390 { 00:16:21.390 "name": "BaseBdev4", 00:16:21.390 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:21.390 "is_configured": true, 00:16:21.390 "data_offset": 0, 00:16:21.390 "data_size": 65536 00:16:21.390 } 00:16:21.390 ] 00:16:21.390 }' 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.390 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.955 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.955 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:22.214 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:22.214 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.214 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:22.214 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d4544725-42a4-4d65-8d56-697f768c5b4d 00:16:22.471 [2024-07-15 13:39:09.938962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:22.471 [2024-07-15 13:39:09.939006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cef420 00:16:22.471 [2024-07-15 13:39:09.939012] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:22.471 [2024-07-15 13:39:09.939173] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf4450 00:16:22.471 [2024-07-15 13:39:09.939257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cef420 00:16:22.471 [2024-07-15 13:39:09.939264] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cef420 00:16:22.471 [2024-07-15 13:39:09.939387] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:22.471 NewBaseBdev 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:22.471 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:22.729 [ 00:16:22.729 { 00:16:22.729 "name": "NewBaseBdev", 00:16:22.729 "aliases": [ 00:16:22.729 "d4544725-42a4-4d65-8d56-697f768c5b4d" 00:16:22.729 ], 00:16:22.729 "product_name": "Malloc disk", 00:16:22.729 "block_size": 512, 00:16:22.729 "num_blocks": 65536, 00:16:22.729 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:22.729 "assigned_rate_limits": { 00:16:22.729 "rw_ios_per_sec": 0, 00:16:22.729 "rw_mbytes_per_sec": 0, 00:16:22.729 "r_mbytes_per_sec": 0, 00:16:22.729 "w_mbytes_per_sec": 0 00:16:22.729 }, 00:16:22.729 "claimed": true, 00:16:22.729 "claim_type": "exclusive_write", 00:16:22.729 "zoned": false, 00:16:22.729 "supported_io_types": { 00:16:22.729 "read": true, 00:16:22.729 "write": true, 00:16:22.729 "unmap": true, 00:16:22.729 "flush": true, 00:16:22.729 "reset": true, 00:16:22.729 "nvme_admin": false, 00:16:22.729 "nvme_io": false, 00:16:22.729 "nvme_io_md": false, 00:16:22.729 "write_zeroes": true, 00:16:22.729 "zcopy": true, 00:16:22.729 "get_zone_info": false, 00:16:22.729 "zone_management": false, 00:16:22.729 "zone_append": false, 00:16:22.729 "compare": false, 00:16:22.729 "compare_and_write": false, 00:16:22.729 "abort": true, 00:16:22.729 "seek_hole": false, 00:16:22.729 "seek_data": false, 00:16:22.729 "copy": true, 00:16:22.729 "nvme_iov_md": false 00:16:22.729 }, 00:16:22.729 "memory_domains": [ 00:16:22.729 { 00:16:22.729 "dma_device_id": "system", 00:16:22.729 "dma_device_type": 1 00:16:22.729 }, 00:16:22.729 { 00:16:22.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.729 "dma_device_type": 2 00:16:22.729 } 00:16:22.729 ], 00:16:22.729 "driver_specific": {} 00:16:22.729 } 00:16:22.729 ] 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.729 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:16:22.987 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.987 "name": "Existed_Raid", 00:16:22.987 "uuid": "3de913fd-c388-40ea-b058-fcd09572ae60", 00:16:22.987 "strip_size_kb": 64, 00:16:22.987 "state": "online", 00:16:22.987 "raid_level": "concat", 00:16:22.987 "superblock": false, 00:16:22.987 "num_base_bdevs": 4, 00:16:22.987 "num_base_bdevs_discovered": 4, 00:16:22.987 "num_base_bdevs_operational": 4, 00:16:22.987 "base_bdevs_list": [ 00:16:22.987 { 00:16:22.987 "name": "NewBaseBdev", 00:16:22.987 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:22.987 "is_configured": true, 00:16:22.987 "data_offset": 0, 00:16:22.987 "data_size": 65536 00:16:22.987 }, 00:16:22.987 { 00:16:22.987 "name": "BaseBdev2", 00:16:22.987 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:22.987 "is_configured": true, 00:16:22.987 "data_offset": 0, 00:16:22.987 "data_size": 65536 00:16:22.987 }, 00:16:22.987 { 00:16:22.987 "name": "BaseBdev3", 00:16:22.987 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:22.987 "is_configured": true, 00:16:22.987 "data_offset": 0, 00:16:22.987 "data_size": 65536 00:16:22.987 }, 00:16:22.987 { 00:16:22.987 "name": "BaseBdev4", 00:16:22.987 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:22.987 "is_configured": true, 00:16:22.987 "data_offset": 0, 00:16:22.987 "data_size": 65536 00:16:22.987 } 00:16:22.987 ] 00:16:22.987 }' 00:16:22.987 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.987 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:23.552 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:23.552 [2024-07-15 13:39:11.146320] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:23.552 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:23.552 "name": "Existed_Raid", 00:16:23.552 "aliases": [ 00:16:23.552 "3de913fd-c388-40ea-b058-fcd09572ae60" 00:16:23.552 ], 00:16:23.552 "product_name": "Raid Volume", 00:16:23.552 "block_size": 512, 00:16:23.552 "num_blocks": 262144, 00:16:23.552 "uuid": "3de913fd-c388-40ea-b058-fcd09572ae60", 00:16:23.552 "assigned_rate_limits": { 00:16:23.552 "rw_ios_per_sec": 0, 00:16:23.552 "rw_mbytes_per_sec": 0, 00:16:23.552 "r_mbytes_per_sec": 0, 00:16:23.552 "w_mbytes_per_sec": 0 00:16:23.552 }, 00:16:23.552 "claimed": false, 00:16:23.552 "zoned": false, 00:16:23.552 "supported_io_types": { 00:16:23.552 "read": true, 00:16:23.552 "write": true, 00:16:23.552 "unmap": true, 
00:16:23.552 "flush": true, 00:16:23.552 "reset": true, 00:16:23.552 "nvme_admin": false, 00:16:23.552 "nvme_io": false, 00:16:23.552 "nvme_io_md": false, 00:16:23.552 "write_zeroes": true, 00:16:23.552 "zcopy": false, 00:16:23.552 "get_zone_info": false, 00:16:23.552 "zone_management": false, 00:16:23.552 "zone_append": false, 00:16:23.552 "compare": false, 00:16:23.552 "compare_and_write": false, 00:16:23.552 "abort": false, 00:16:23.552 "seek_hole": false, 00:16:23.552 "seek_data": false, 00:16:23.552 "copy": false, 00:16:23.552 "nvme_iov_md": false 00:16:23.552 }, 00:16:23.552 "memory_domains": [ 00:16:23.552 { 00:16:23.552 "dma_device_id": "system", 00:16:23.552 "dma_device_type": 1 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.552 "dma_device_type": 2 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "system", 00:16:23.552 "dma_device_type": 1 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.552 "dma_device_type": 2 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "system", 00:16:23.552 "dma_device_type": 1 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.552 "dma_device_type": 2 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "system", 00:16:23.552 "dma_device_type": 1 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.552 "dma_device_type": 2 00:16:23.552 } 00:16:23.552 ], 00:16:23.552 "driver_specific": { 00:16:23.552 "raid": { 00:16:23.552 "uuid": "3de913fd-c388-40ea-b058-fcd09572ae60", 00:16:23.552 "strip_size_kb": 64, 00:16:23.552 "state": "online", 00:16:23.552 "raid_level": "concat", 00:16:23.552 "superblock": false, 00:16:23.552 "num_base_bdevs": 4, 00:16:23.552 "num_base_bdevs_discovered": 4, 00:16:23.552 "num_base_bdevs_operational": 4, 00:16:23.552 "base_bdevs_list": [ 00:16:23.552 { 00:16:23.552 "name": "NewBaseBdev", 00:16:23.552 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:23.552 "is_configured": true, 00:16:23.552 "data_offset": 0, 00:16:23.552 "data_size": 65536 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "name": "BaseBdev2", 00:16:23.552 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:23.552 "is_configured": true, 00:16:23.552 "data_offset": 0, 00:16:23.552 "data_size": 65536 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "name": "BaseBdev3", 00:16:23.552 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:23.552 "is_configured": true, 00:16:23.552 "data_offset": 0, 00:16:23.552 "data_size": 65536 00:16:23.552 }, 00:16:23.552 { 00:16:23.552 "name": "BaseBdev4", 00:16:23.552 "uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:23.552 "is_configured": true, 00:16:23.552 "data_offset": 0, 00:16:23.552 "data_size": 65536 00:16:23.552 } 00:16:23.552 ] 00:16:23.552 } 00:16:23.552 } 00:16:23.552 }' 00:16:23.552 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:23.808 BaseBdev2 00:16:23.808 BaseBdev3 00:16:23.808 BaseBdev4' 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.808 "name": "NewBaseBdev", 00:16:23.808 "aliases": [ 00:16:23.808 "d4544725-42a4-4d65-8d56-697f768c5b4d" 00:16:23.808 ], 00:16:23.808 "product_name": "Malloc disk", 00:16:23.808 "block_size": 512, 00:16:23.808 "num_blocks": 65536, 00:16:23.808 "uuid": "d4544725-42a4-4d65-8d56-697f768c5b4d", 00:16:23.808 "assigned_rate_limits": { 00:16:23.808 "rw_ios_per_sec": 0, 00:16:23.808 "rw_mbytes_per_sec": 0, 00:16:23.808 "r_mbytes_per_sec": 0, 00:16:23.808 "w_mbytes_per_sec": 0 00:16:23.808 }, 00:16:23.808 "claimed": true, 00:16:23.808 "claim_type": "exclusive_write", 00:16:23.808 "zoned": false, 00:16:23.808 "supported_io_types": { 00:16:23.808 "read": true, 00:16:23.808 "write": true, 00:16:23.808 "unmap": true, 00:16:23.808 "flush": true, 00:16:23.808 "reset": true, 00:16:23.808 "nvme_admin": false, 00:16:23.808 "nvme_io": false, 00:16:23.808 "nvme_io_md": false, 00:16:23.808 "write_zeroes": true, 00:16:23.808 "zcopy": true, 00:16:23.808 "get_zone_info": false, 00:16:23.808 "zone_management": false, 00:16:23.808 "zone_append": false, 00:16:23.808 "compare": false, 00:16:23.808 "compare_and_write": false, 00:16:23.808 "abort": true, 00:16:23.808 "seek_hole": false, 00:16:23.808 "seek_data": false, 00:16:23.808 "copy": true, 00:16:23.808 "nvme_iov_md": false 00:16:23.808 }, 00:16:23.808 "memory_domains": [ 00:16:23.808 { 00:16:23.808 "dma_device_id": "system", 00:16:23.808 "dma_device_type": 1 00:16:23.808 }, 00:16:23.808 { 00:16:23.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.808 "dma_device_type": 2 00:16:23.808 } 00:16:23.808 ], 00:16:23.808 "driver_specific": {} 00:16:23.808 }' 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.808 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.064 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.320 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.320 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.320 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:24.320 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:24.320 13:39:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.320 "name": "BaseBdev2", 00:16:24.320 "aliases": [ 00:16:24.320 "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5" 00:16:24.320 ], 00:16:24.320 "product_name": "Malloc disk", 00:16:24.320 "block_size": 512, 00:16:24.320 "num_blocks": 65536, 00:16:24.320 "uuid": "d743d72e-1e60-4880-a2c8-1b64f9bbb4a5", 00:16:24.320 "assigned_rate_limits": { 00:16:24.320 "rw_ios_per_sec": 0, 00:16:24.320 "rw_mbytes_per_sec": 0, 00:16:24.320 "r_mbytes_per_sec": 0, 00:16:24.320 "w_mbytes_per_sec": 0 00:16:24.320 }, 00:16:24.320 "claimed": true, 00:16:24.320 "claim_type": "exclusive_write", 00:16:24.320 "zoned": false, 00:16:24.320 "supported_io_types": { 00:16:24.320 "read": true, 00:16:24.320 "write": true, 00:16:24.320 "unmap": true, 00:16:24.320 "flush": true, 00:16:24.320 "reset": true, 00:16:24.320 "nvme_admin": false, 00:16:24.320 "nvme_io": false, 00:16:24.320 "nvme_io_md": false, 00:16:24.320 "write_zeroes": true, 00:16:24.320 "zcopy": true, 00:16:24.320 "get_zone_info": false, 00:16:24.320 "zone_management": false, 00:16:24.320 "zone_append": false, 00:16:24.320 "compare": false, 00:16:24.320 "compare_and_write": false, 00:16:24.320 "abort": true, 00:16:24.320 "seek_hole": false, 00:16:24.320 "seek_data": false, 00:16:24.320 "copy": true, 00:16:24.320 "nvme_iov_md": false 00:16:24.320 }, 00:16:24.320 "memory_domains": [ 00:16:24.320 { 00:16:24.320 "dma_device_id": "system", 00:16:24.320 "dma_device_type": 1 00:16:24.320 }, 00:16:24.320 { 00:16:24.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.320 "dma_device_type": 2 00:16:24.320 } 00:16:24.320 ], 00:16:24.320 "driver_specific": {} 00:16:24.320 }' 00:16:24.320 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.320 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.577 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.577 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.577 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:24.577 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:24.834 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.834 "name": "BaseBdev3", 00:16:24.834 "aliases": [ 00:16:24.834 
"2cbaadc2-14b3-498f-be3c-9d613ae375c5" 00:16:24.834 ], 00:16:24.834 "product_name": "Malloc disk", 00:16:24.834 "block_size": 512, 00:16:24.834 "num_blocks": 65536, 00:16:24.834 "uuid": "2cbaadc2-14b3-498f-be3c-9d613ae375c5", 00:16:24.834 "assigned_rate_limits": { 00:16:24.834 "rw_ios_per_sec": 0, 00:16:24.834 "rw_mbytes_per_sec": 0, 00:16:24.834 "r_mbytes_per_sec": 0, 00:16:24.834 "w_mbytes_per_sec": 0 00:16:24.834 }, 00:16:24.834 "claimed": true, 00:16:24.834 "claim_type": "exclusive_write", 00:16:24.834 "zoned": false, 00:16:24.834 "supported_io_types": { 00:16:24.834 "read": true, 00:16:24.834 "write": true, 00:16:24.834 "unmap": true, 00:16:24.834 "flush": true, 00:16:24.834 "reset": true, 00:16:24.834 "nvme_admin": false, 00:16:24.834 "nvme_io": false, 00:16:24.834 "nvme_io_md": false, 00:16:24.834 "write_zeroes": true, 00:16:24.834 "zcopy": true, 00:16:24.834 "get_zone_info": false, 00:16:24.834 "zone_management": false, 00:16:24.834 "zone_append": false, 00:16:24.834 "compare": false, 00:16:24.834 "compare_and_write": false, 00:16:24.834 "abort": true, 00:16:24.834 "seek_hole": false, 00:16:24.834 "seek_data": false, 00:16:24.834 "copy": true, 00:16:24.834 "nvme_iov_md": false 00:16:24.834 }, 00:16:24.834 "memory_domains": [ 00:16:24.834 { 00:16:24.834 "dma_device_id": "system", 00:16:24.834 "dma_device_type": 1 00:16:24.834 }, 00:16:24.834 { 00:16:24.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.834 "dma_device_type": 2 00:16:24.834 } 00:16:24.834 ], 00:16:24.834 "driver_specific": {} 00:16:24.834 }' 00:16:24.834 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.834 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.834 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.834 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.834 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:25.091 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.348 "name": "BaseBdev4", 00:16:25.348 "aliases": [ 00:16:25.348 "b700ed8a-bfc9-46ed-ad79-15514f96f4ff" 00:16:25.348 ], 00:16:25.348 "product_name": "Malloc disk", 00:16:25.348 "block_size": 512, 00:16:25.348 "num_blocks": 65536, 00:16:25.348 
"uuid": "b700ed8a-bfc9-46ed-ad79-15514f96f4ff", 00:16:25.348 "assigned_rate_limits": { 00:16:25.348 "rw_ios_per_sec": 0, 00:16:25.348 "rw_mbytes_per_sec": 0, 00:16:25.348 "r_mbytes_per_sec": 0, 00:16:25.348 "w_mbytes_per_sec": 0 00:16:25.348 }, 00:16:25.348 "claimed": true, 00:16:25.348 "claim_type": "exclusive_write", 00:16:25.348 "zoned": false, 00:16:25.348 "supported_io_types": { 00:16:25.348 "read": true, 00:16:25.348 "write": true, 00:16:25.348 "unmap": true, 00:16:25.348 "flush": true, 00:16:25.348 "reset": true, 00:16:25.348 "nvme_admin": false, 00:16:25.348 "nvme_io": false, 00:16:25.348 "nvme_io_md": false, 00:16:25.348 "write_zeroes": true, 00:16:25.348 "zcopy": true, 00:16:25.348 "get_zone_info": false, 00:16:25.348 "zone_management": false, 00:16:25.348 "zone_append": false, 00:16:25.348 "compare": false, 00:16:25.348 "compare_and_write": false, 00:16:25.348 "abort": true, 00:16:25.348 "seek_hole": false, 00:16:25.348 "seek_data": false, 00:16:25.348 "copy": true, 00:16:25.348 "nvme_iov_md": false 00:16:25.348 }, 00:16:25.348 "memory_domains": [ 00:16:25.348 { 00:16:25.348 "dma_device_id": "system", 00:16:25.348 "dma_device_type": 1 00:16:25.348 }, 00:16:25.348 { 00:16:25.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.348 "dma_device_type": 2 00:16:25.348 } 00:16:25.348 ], 00:16:25.348 "driver_specific": {} 00:16:25.348 }' 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.348 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.604 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.604 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.604 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.604 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.604 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.604 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:25.861 [2024-07-15 13:39:13.251543] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:25.861 [2024-07-15 13:39:13.251569] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:25.861 [2024-07-15 13:39:13.251610] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:25.861 [2024-07-15 13:39:13.251653] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:25.861 [2024-07-15 13:39:13.251662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cef420 name Existed_Raid, state offline 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 32886 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 32886 ']' 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 32886 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 32886 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 32886' 00:16:25.861 killing process with pid 32886 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 32886 00:16:25.861 [2024-07-15 13:39:13.315537] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:25.861 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 32886 00:16:25.861 [2024-07-15 13:39:13.356652] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:26.119 00:16:26.119 real 0m25.031s 00:16:26.119 user 0m45.591s 00:16:26.119 sys 0m4.843s 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.119 ************************************ 00:16:26.119 END TEST raid_state_function_test 00:16:26.119 ************************************ 00:16:26.119 13:39:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:26.119 13:39:13 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:16:26.119 13:39:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:26.119 13:39:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:26.119 13:39:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:26.119 ************************************ 00:16:26.119 START TEST raid_state_function_test_sb 00:16:26.119 ************************************ 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:26.119 13:39:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=37426 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 37426' 00:16:26.119 Process raid pid: 37426 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 37426 /var/tmp/spdk-raid.sock 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 37426 ']' 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to 
start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:26.119 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:26.119 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:26.120 13:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:26.120 [2024-07-15 13:39:13.709988] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:16:26.120 [2024-07-15 13:39:13.710043] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:26.376 [2024-07-15 13:39:13.798483] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:26.376 [2024-07-15 13:39:13.889488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:26.376 [2024-07-15 13:39:13.947739] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.376 [2024-07-15 13:39:13.947764] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.941 13:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:26.941 13:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:26.941 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:27.198 [2024-07-15 13:39:14.679498] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:27.199 [2024-07-15 13:39:14.679532] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:27.199 [2024-07-15 13:39:14.679540] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:27.199 [2024-07-15 13:39:14.679564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:27.199 [2024-07-15 13:39:14.679570] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:27.199 [2024-07-15 13:39:14.679577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:27.199 [2024-07-15 13:39:14.679583] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:27.199 [2024-07-15 13:39:14.679590] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:27.199 13:39:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.199 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.457 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.457 "name": "Existed_Raid", 00:16:27.457 "uuid": "719b04d1-4de9-49f4-a400-f86320aae5bd", 00:16:27.457 "strip_size_kb": 64, 00:16:27.457 "state": "configuring", 00:16:27.457 "raid_level": "concat", 00:16:27.457 "superblock": true, 00:16:27.457 "num_base_bdevs": 4, 00:16:27.457 "num_base_bdevs_discovered": 0, 00:16:27.457 "num_base_bdevs_operational": 4, 00:16:27.457 "base_bdevs_list": [ 00:16:27.457 { 00:16:27.457 "name": "BaseBdev1", 00:16:27.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.457 "is_configured": false, 00:16:27.457 "data_offset": 0, 00:16:27.457 "data_size": 0 00:16:27.457 }, 00:16:27.457 { 00:16:27.457 "name": "BaseBdev2", 00:16:27.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.457 "is_configured": false, 00:16:27.457 "data_offset": 0, 00:16:27.457 "data_size": 0 00:16:27.457 }, 00:16:27.457 { 00:16:27.457 "name": "BaseBdev3", 00:16:27.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.457 "is_configured": false, 00:16:27.457 "data_offset": 0, 00:16:27.457 "data_size": 0 00:16:27.457 }, 00:16:27.457 { 00:16:27.457 "name": "BaseBdev4", 00:16:27.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.457 "is_configured": false, 00:16:27.457 "data_offset": 0, 00:16:27.457 "data_size": 0 00:16:27.457 } 00:16:27.457 ] 00:16:27.457 }' 00:16:27.457 13:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.457 13:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.023 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:28.023 [2024-07-15 13:39:15.489467] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:28.023 [2024-07-15 13:39:15.489490] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13fff70 name Existed_Raid, state configuring 00:16:28.023 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:28.281 [2024-07-15 13:39:15.661937] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:28.281 [2024-07-15 13:39:15.661959] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:28.281 [2024-07-15 13:39:15.661965] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:28.281 [2024-07-15 13:39:15.661974] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:28.281 [2024-07-15 13:39:15.662000] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:28.281 [2024-07-15 13:39:15.662008] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:28.281 [2024-07-15 13:39:15.662014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:28.282 [2024-07-15 13:39:15.662022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:28.282 [2024-07-15 13:39:15.851114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:28.282 BaseBdev1 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:28.282 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:28.540 13:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:28.798 [ 00:16:28.798 { 00:16:28.798 "name": "BaseBdev1", 00:16:28.798 "aliases": [ 00:16:28.798 "ba5e8832-6be6-4e14-93a3-2d394d87927b" 00:16:28.798 ], 00:16:28.798 "product_name": "Malloc disk", 00:16:28.798 "block_size": 512, 00:16:28.798 "num_blocks": 65536, 00:16:28.798 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:28.798 "assigned_rate_limits": { 00:16:28.798 "rw_ios_per_sec": 0, 00:16:28.798 "rw_mbytes_per_sec": 0, 00:16:28.798 "r_mbytes_per_sec": 0, 00:16:28.798 "w_mbytes_per_sec": 0 00:16:28.798 }, 00:16:28.798 "claimed": true, 00:16:28.798 "claim_type": "exclusive_write", 00:16:28.798 "zoned": false, 00:16:28.798 "supported_io_types": { 00:16:28.798 "read": true, 00:16:28.798 "write": true, 00:16:28.798 "unmap": true, 00:16:28.798 "flush": true, 00:16:28.798 "reset": true, 00:16:28.798 "nvme_admin": false, 00:16:28.798 "nvme_io": false, 00:16:28.798 "nvme_io_md": false, 00:16:28.798 "write_zeroes": true, 00:16:28.798 "zcopy": true, 00:16:28.798 "get_zone_info": false, 00:16:28.798 "zone_management": false, 00:16:28.798 "zone_append": false, 00:16:28.798 "compare": false, 00:16:28.798 "compare_and_write": false, 00:16:28.798 "abort": true, 00:16:28.798 "seek_hole": false, 00:16:28.798 "seek_data": false, 00:16:28.798 "copy": true, 00:16:28.798 "nvme_iov_md": false 00:16:28.798 }, 00:16:28.798 "memory_domains": [ 00:16:28.798 { 00:16:28.798 "dma_device_id": "system", 
00:16:28.798 "dma_device_type": 1 00:16:28.798 }, 00:16:28.798 { 00:16:28.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.798 "dma_device_type": 2 00:16:28.798 } 00:16:28.798 ], 00:16:28.798 "driver_specific": {} 00:16:28.798 } 00:16:28.798 ] 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.798 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.798 "name": "Existed_Raid", 00:16:28.798 "uuid": "f5359053-34ed-4600-9429-9a053ee7a9b4", 00:16:28.798 "strip_size_kb": 64, 00:16:28.798 "state": "configuring", 00:16:28.798 "raid_level": "concat", 00:16:28.798 "superblock": true, 00:16:28.798 "num_base_bdevs": 4, 00:16:28.798 "num_base_bdevs_discovered": 1, 00:16:28.798 "num_base_bdevs_operational": 4, 00:16:28.798 "base_bdevs_list": [ 00:16:28.798 { 00:16:28.798 "name": "BaseBdev1", 00:16:28.798 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:28.798 "is_configured": true, 00:16:28.798 "data_offset": 2048, 00:16:28.798 "data_size": 63488 00:16:28.798 }, 00:16:28.798 { 00:16:28.798 "name": "BaseBdev2", 00:16:28.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.798 "is_configured": false, 00:16:28.798 "data_offset": 0, 00:16:28.798 "data_size": 0 00:16:28.798 }, 00:16:28.798 { 00:16:28.798 "name": "BaseBdev3", 00:16:28.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.798 "is_configured": false, 00:16:28.798 "data_offset": 0, 00:16:28.798 "data_size": 0 00:16:28.798 }, 00:16:28.798 { 00:16:28.798 "name": "BaseBdev4", 00:16:28.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.799 "is_configured": false, 00:16:28.799 "data_offset": 0, 00:16:28.799 "data_size": 0 00:16:28.799 } 00:16:28.799 ] 00:16:28.799 }' 00:16:28.799 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.799 13:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:29.365 13:39:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:29.623 [2024-07-15 13:39:17.046203] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:29.623 [2024-07-15 13:39:17.046237] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ff7e0 name Existed_Raid, state configuring 00:16:29.623 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:29.623 [2024-07-15 13:39:17.222689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:29.623 [2024-07-15 13:39:17.223759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:29.623 [2024-07-15 13:39:17.223785] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:29.623 [2024-07-15 13:39:17.223792] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:29.623 [2024-07-15 13:39:17.223799] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.623 [2024-07-15 13:39:17.223821] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:29.623 [2024-07-15 13:39:17.223828] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.883 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.883 "name": "Existed_Raid", 00:16:29.883 
"uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:29.883 "strip_size_kb": 64, 00:16:29.883 "state": "configuring", 00:16:29.883 "raid_level": "concat", 00:16:29.883 "superblock": true, 00:16:29.883 "num_base_bdevs": 4, 00:16:29.883 "num_base_bdevs_discovered": 1, 00:16:29.883 "num_base_bdevs_operational": 4, 00:16:29.883 "base_bdevs_list": [ 00:16:29.883 { 00:16:29.883 "name": "BaseBdev1", 00:16:29.883 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:29.883 "is_configured": true, 00:16:29.883 "data_offset": 2048, 00:16:29.883 "data_size": 63488 00:16:29.883 }, 00:16:29.883 { 00:16:29.883 "name": "BaseBdev2", 00:16:29.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.883 "is_configured": false, 00:16:29.883 "data_offset": 0, 00:16:29.883 "data_size": 0 00:16:29.883 }, 00:16:29.883 { 00:16:29.883 "name": "BaseBdev3", 00:16:29.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.883 "is_configured": false, 00:16:29.883 "data_offset": 0, 00:16:29.883 "data_size": 0 00:16:29.883 }, 00:16:29.883 { 00:16:29.883 "name": "BaseBdev4", 00:16:29.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.883 "is_configured": false, 00:16:29.883 "data_offset": 0, 00:16:29.883 "data_size": 0 00:16:29.884 } 00:16:29.884 ] 00:16:29.884 }' 00:16:29.884 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.884 13:39:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.450 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:30.450 [2024-07-15 13:39:18.051529] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:30.450 BaseBdev2 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.709 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:30.967 [ 00:16:30.967 { 00:16:30.967 "name": "BaseBdev2", 00:16:30.967 "aliases": [ 00:16:30.967 "85538e42-0ee3-4fb6-b8d8-1be327770f20" 00:16:30.967 ], 00:16:30.967 "product_name": "Malloc disk", 00:16:30.967 "block_size": 512, 00:16:30.967 "num_blocks": 65536, 00:16:30.967 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:30.967 "assigned_rate_limits": { 00:16:30.967 "rw_ios_per_sec": 0, 00:16:30.967 "rw_mbytes_per_sec": 0, 00:16:30.967 "r_mbytes_per_sec": 0, 00:16:30.967 "w_mbytes_per_sec": 0 00:16:30.967 }, 00:16:30.967 "claimed": true, 00:16:30.967 "claim_type": 
"exclusive_write", 00:16:30.967 "zoned": false, 00:16:30.967 "supported_io_types": { 00:16:30.967 "read": true, 00:16:30.967 "write": true, 00:16:30.967 "unmap": true, 00:16:30.967 "flush": true, 00:16:30.967 "reset": true, 00:16:30.967 "nvme_admin": false, 00:16:30.967 "nvme_io": false, 00:16:30.967 "nvme_io_md": false, 00:16:30.967 "write_zeroes": true, 00:16:30.967 "zcopy": true, 00:16:30.967 "get_zone_info": false, 00:16:30.967 "zone_management": false, 00:16:30.967 "zone_append": false, 00:16:30.967 "compare": false, 00:16:30.967 "compare_and_write": false, 00:16:30.967 "abort": true, 00:16:30.967 "seek_hole": false, 00:16:30.967 "seek_data": false, 00:16:30.967 "copy": true, 00:16:30.967 "nvme_iov_md": false 00:16:30.967 }, 00:16:30.967 "memory_domains": [ 00:16:30.967 { 00:16:30.967 "dma_device_id": "system", 00:16:30.967 "dma_device_type": 1 00:16:30.967 }, 00:16:30.967 { 00:16:30.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.967 "dma_device_type": 2 00:16:30.967 } 00:16:30.967 ], 00:16:30.967 "driver_specific": {} 00:16:30.967 } 00:16:30.967 ] 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.967 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.226 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.226 "name": "Existed_Raid", 00:16:31.226 "uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:31.226 "strip_size_kb": 64, 00:16:31.226 "state": "configuring", 00:16:31.226 "raid_level": "concat", 00:16:31.226 "superblock": true, 00:16:31.226 "num_base_bdevs": 4, 00:16:31.226 "num_base_bdevs_discovered": 2, 00:16:31.226 "num_base_bdevs_operational": 4, 00:16:31.226 "base_bdevs_list": [ 00:16:31.226 { 00:16:31.226 "name": "BaseBdev1", 00:16:31.226 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 
00:16:31.226 "is_configured": true, 00:16:31.226 "data_offset": 2048, 00:16:31.226 "data_size": 63488 00:16:31.226 }, 00:16:31.226 { 00:16:31.226 "name": "BaseBdev2", 00:16:31.226 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:31.226 "is_configured": true, 00:16:31.226 "data_offset": 2048, 00:16:31.226 "data_size": 63488 00:16:31.226 }, 00:16:31.226 { 00:16:31.226 "name": "BaseBdev3", 00:16:31.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.226 "is_configured": false, 00:16:31.226 "data_offset": 0, 00:16:31.226 "data_size": 0 00:16:31.226 }, 00:16:31.226 { 00:16:31.226 "name": "BaseBdev4", 00:16:31.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.226 "is_configured": false, 00:16:31.226 "data_offset": 0, 00:16:31.226 "data_size": 0 00:16:31.226 } 00:16:31.226 ] 00:16:31.226 }' 00:16:31.226 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.226 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.792 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:31.793 [2024-07-15 13:39:19.277534] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:31.793 BaseBdev3 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:31.793 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:32.051 [ 00:16:32.051 { 00:16:32.051 "name": "BaseBdev3", 00:16:32.051 "aliases": [ 00:16:32.051 "c88e0a12-75e0-4574-ab17-bf073d66e8c3" 00:16:32.051 ], 00:16:32.051 "product_name": "Malloc disk", 00:16:32.051 "block_size": 512, 00:16:32.051 "num_blocks": 65536, 00:16:32.051 "uuid": "c88e0a12-75e0-4574-ab17-bf073d66e8c3", 00:16:32.051 "assigned_rate_limits": { 00:16:32.051 "rw_ios_per_sec": 0, 00:16:32.051 "rw_mbytes_per_sec": 0, 00:16:32.051 "r_mbytes_per_sec": 0, 00:16:32.051 "w_mbytes_per_sec": 0 00:16:32.051 }, 00:16:32.051 "claimed": true, 00:16:32.051 "claim_type": "exclusive_write", 00:16:32.051 "zoned": false, 00:16:32.051 "supported_io_types": { 00:16:32.051 "read": true, 00:16:32.051 "write": true, 00:16:32.051 "unmap": true, 00:16:32.051 "flush": true, 00:16:32.051 "reset": true, 00:16:32.051 "nvme_admin": false, 00:16:32.051 "nvme_io": false, 00:16:32.051 "nvme_io_md": false, 00:16:32.051 "write_zeroes": true, 00:16:32.051 "zcopy": true, 00:16:32.051 "get_zone_info": false, 00:16:32.051 "zone_management": 
false, 00:16:32.051 "zone_append": false, 00:16:32.051 "compare": false, 00:16:32.051 "compare_and_write": false, 00:16:32.051 "abort": true, 00:16:32.051 "seek_hole": false, 00:16:32.051 "seek_data": false, 00:16:32.051 "copy": true, 00:16:32.051 "nvme_iov_md": false 00:16:32.051 }, 00:16:32.051 "memory_domains": [ 00:16:32.051 { 00:16:32.051 "dma_device_id": "system", 00:16:32.051 "dma_device_type": 1 00:16:32.051 }, 00:16:32.051 { 00:16:32.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.051 "dma_device_type": 2 00:16:32.051 } 00:16:32.051 ], 00:16:32.051 "driver_specific": {} 00:16:32.051 } 00:16:32.051 ] 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.051 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.309 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.309 "name": "Existed_Raid", 00:16:32.309 "uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:32.309 "strip_size_kb": 64, 00:16:32.309 "state": "configuring", 00:16:32.309 "raid_level": "concat", 00:16:32.309 "superblock": true, 00:16:32.309 "num_base_bdevs": 4, 00:16:32.309 "num_base_bdevs_discovered": 3, 00:16:32.309 "num_base_bdevs_operational": 4, 00:16:32.309 "base_bdevs_list": [ 00:16:32.309 { 00:16:32.309 "name": "BaseBdev1", 00:16:32.309 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:32.309 "is_configured": true, 00:16:32.309 "data_offset": 2048, 00:16:32.309 "data_size": 63488 00:16:32.309 }, 00:16:32.309 { 00:16:32.309 "name": "BaseBdev2", 00:16:32.309 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:32.309 "is_configured": true, 00:16:32.309 "data_offset": 2048, 00:16:32.309 "data_size": 63488 00:16:32.309 }, 00:16:32.309 { 00:16:32.309 "name": "BaseBdev3", 00:16:32.309 "uuid": "c88e0a12-75e0-4574-ab17-bf073d66e8c3", 
00:16:32.309 "is_configured": true, 00:16:32.309 "data_offset": 2048, 00:16:32.309 "data_size": 63488 00:16:32.309 }, 00:16:32.309 { 00:16:32.309 "name": "BaseBdev4", 00:16:32.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.309 "is_configured": false, 00:16:32.309 "data_offset": 0, 00:16:32.309 "data_size": 0 00:16:32.309 } 00:16:32.309 ] 00:16:32.309 }' 00:16:32.309 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.310 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.876 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:32.876 [2024-07-15 13:39:20.479583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:32.876 [2024-07-15 13:39:20.479731] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1400840 00:16:32.876 [2024-07-15 13:39:20.479741] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:32.876 [2024-07-15 13:39:20.479870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1400480 00:16:32.876 [2024-07-15 13:39:20.479961] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1400840 00:16:32.876 [2024-07-15 13:39:20.479968] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1400840 00:16:32.876 [2024-07-15 13:39:20.480046] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:32.876 BaseBdev4 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:33.135 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:33.393 [ 00:16:33.393 { 00:16:33.393 "name": "BaseBdev4", 00:16:33.393 "aliases": [ 00:16:33.393 "39600f47-f107-432f-b1c0-713ff4b2ca44" 00:16:33.393 ], 00:16:33.393 "product_name": "Malloc disk", 00:16:33.393 "block_size": 512, 00:16:33.393 "num_blocks": 65536, 00:16:33.393 "uuid": "39600f47-f107-432f-b1c0-713ff4b2ca44", 00:16:33.393 "assigned_rate_limits": { 00:16:33.393 "rw_ios_per_sec": 0, 00:16:33.393 "rw_mbytes_per_sec": 0, 00:16:33.393 "r_mbytes_per_sec": 0, 00:16:33.393 "w_mbytes_per_sec": 0 00:16:33.393 }, 00:16:33.393 "claimed": true, 00:16:33.393 "claim_type": "exclusive_write", 00:16:33.393 "zoned": false, 00:16:33.393 "supported_io_types": { 00:16:33.393 "read": true, 00:16:33.393 "write": true, 
00:16:33.393 "unmap": true, 00:16:33.393 "flush": true, 00:16:33.393 "reset": true, 00:16:33.393 "nvme_admin": false, 00:16:33.393 "nvme_io": false, 00:16:33.393 "nvme_io_md": false, 00:16:33.393 "write_zeroes": true, 00:16:33.393 "zcopy": true, 00:16:33.393 "get_zone_info": false, 00:16:33.393 "zone_management": false, 00:16:33.393 "zone_append": false, 00:16:33.393 "compare": false, 00:16:33.393 "compare_and_write": false, 00:16:33.393 "abort": true, 00:16:33.393 "seek_hole": false, 00:16:33.393 "seek_data": false, 00:16:33.393 "copy": true, 00:16:33.393 "nvme_iov_md": false 00:16:33.393 }, 00:16:33.393 "memory_domains": [ 00:16:33.393 { 00:16:33.393 "dma_device_id": "system", 00:16:33.393 "dma_device_type": 1 00:16:33.393 }, 00:16:33.393 { 00:16:33.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.393 "dma_device_type": 2 00:16:33.393 } 00:16:33.393 ], 00:16:33.393 "driver_specific": {} 00:16:33.393 } 00:16:33.393 ] 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.393 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.652 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.652 "name": "Existed_Raid", 00:16:33.652 "uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:33.652 "strip_size_kb": 64, 00:16:33.652 "state": "online", 00:16:33.652 "raid_level": "concat", 00:16:33.652 "superblock": true, 00:16:33.652 "num_base_bdevs": 4, 00:16:33.652 "num_base_bdevs_discovered": 4, 00:16:33.652 "num_base_bdevs_operational": 4, 00:16:33.652 "base_bdevs_list": [ 00:16:33.652 { 00:16:33.652 "name": "BaseBdev1", 00:16:33.652 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:33.652 "is_configured": true, 00:16:33.652 "data_offset": 2048, 00:16:33.652 "data_size": 63488 00:16:33.652 }, 00:16:33.652 { 00:16:33.652 "name": 
"BaseBdev2", 00:16:33.652 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:33.652 "is_configured": true, 00:16:33.652 "data_offset": 2048, 00:16:33.652 "data_size": 63488 00:16:33.652 }, 00:16:33.652 { 00:16:33.652 "name": "BaseBdev3", 00:16:33.652 "uuid": "c88e0a12-75e0-4574-ab17-bf073d66e8c3", 00:16:33.652 "is_configured": true, 00:16:33.652 "data_offset": 2048, 00:16:33.652 "data_size": 63488 00:16:33.652 }, 00:16:33.652 { 00:16:33.652 "name": "BaseBdev4", 00:16:33.652 "uuid": "39600f47-f107-432f-b1c0-713ff4b2ca44", 00:16:33.652 "is_configured": true, 00:16:33.652 "data_offset": 2048, 00:16:33.652 "data_size": 63488 00:16:33.652 } 00:16:33.652 ] 00:16:33.652 }' 00:16:33.652 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.652 13:39:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:33.911 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:34.169 [2024-07-15 13:39:21.658846] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:34.169 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:34.169 "name": "Existed_Raid", 00:16:34.169 "aliases": [ 00:16:34.169 "da80a348-b38e-4291-a083-d90b69cbf5b8" 00:16:34.169 ], 00:16:34.169 "product_name": "Raid Volume", 00:16:34.169 "block_size": 512, 00:16:34.169 "num_blocks": 253952, 00:16:34.169 "uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:34.169 "assigned_rate_limits": { 00:16:34.169 "rw_ios_per_sec": 0, 00:16:34.169 "rw_mbytes_per_sec": 0, 00:16:34.169 "r_mbytes_per_sec": 0, 00:16:34.169 "w_mbytes_per_sec": 0 00:16:34.169 }, 00:16:34.169 "claimed": false, 00:16:34.169 "zoned": false, 00:16:34.169 "supported_io_types": { 00:16:34.169 "read": true, 00:16:34.169 "write": true, 00:16:34.169 "unmap": true, 00:16:34.169 "flush": true, 00:16:34.169 "reset": true, 00:16:34.169 "nvme_admin": false, 00:16:34.169 "nvme_io": false, 00:16:34.169 "nvme_io_md": false, 00:16:34.169 "write_zeroes": true, 00:16:34.169 "zcopy": false, 00:16:34.169 "get_zone_info": false, 00:16:34.169 "zone_management": false, 00:16:34.169 "zone_append": false, 00:16:34.169 "compare": false, 00:16:34.169 "compare_and_write": false, 00:16:34.169 "abort": false, 00:16:34.169 "seek_hole": false, 00:16:34.169 "seek_data": false, 00:16:34.169 "copy": false, 00:16:34.169 "nvme_iov_md": false 00:16:34.169 }, 00:16:34.169 "memory_domains": [ 00:16:34.169 { 00:16:34.169 "dma_device_id": "system", 00:16:34.169 "dma_device_type": 1 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:34.169 "dma_device_type": 2 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": "system", 00:16:34.169 "dma_device_type": 1 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.169 "dma_device_type": 2 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": "system", 00:16:34.169 "dma_device_type": 1 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.169 "dma_device_type": 2 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": "system", 00:16:34.169 "dma_device_type": 1 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.169 "dma_device_type": 2 00:16:34.169 } 00:16:34.169 ], 00:16:34.169 "driver_specific": { 00:16:34.169 "raid": { 00:16:34.169 "uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:34.169 "strip_size_kb": 64, 00:16:34.169 "state": "online", 00:16:34.169 "raid_level": "concat", 00:16:34.169 "superblock": true, 00:16:34.169 "num_base_bdevs": 4, 00:16:34.169 "num_base_bdevs_discovered": 4, 00:16:34.169 "num_base_bdevs_operational": 4, 00:16:34.169 "base_bdevs_list": [ 00:16:34.169 { 00:16:34.169 "name": "BaseBdev1", 00:16:34.169 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:34.169 "is_configured": true, 00:16:34.169 "data_offset": 2048, 00:16:34.169 "data_size": 63488 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "name": "BaseBdev2", 00:16:34.169 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:34.169 "is_configured": true, 00:16:34.169 "data_offset": 2048, 00:16:34.169 "data_size": 63488 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "name": "BaseBdev3", 00:16:34.169 "uuid": "c88e0a12-75e0-4574-ab17-bf073d66e8c3", 00:16:34.169 "is_configured": true, 00:16:34.169 "data_offset": 2048, 00:16:34.169 "data_size": 63488 00:16:34.169 }, 00:16:34.169 { 00:16:34.169 "name": "BaseBdev4", 00:16:34.169 "uuid": "39600f47-f107-432f-b1c0-713ff4b2ca44", 00:16:34.169 "is_configured": true, 00:16:34.169 "data_offset": 2048, 00:16:34.169 "data_size": 63488 00:16:34.169 } 00:16:34.169 ] 00:16:34.169 } 00:16:34.169 } 00:16:34.169 }' 00:16:34.169 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:34.169 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:34.169 BaseBdev2 00:16:34.169 BaseBdev3 00:16:34.169 BaseBdev4' 00:16:34.169 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:34.169 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:34.169 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:34.427 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:34.427 "name": "BaseBdev1", 00:16:34.427 "aliases": [ 00:16:34.427 "ba5e8832-6be6-4e14-93a3-2d394d87927b" 00:16:34.427 ], 00:16:34.427 "product_name": "Malloc disk", 00:16:34.427 "block_size": 512, 00:16:34.427 "num_blocks": 65536, 00:16:34.427 "uuid": "ba5e8832-6be6-4e14-93a3-2d394d87927b", 00:16:34.427 "assigned_rate_limits": { 00:16:34.427 "rw_ios_per_sec": 0, 00:16:34.427 "rw_mbytes_per_sec": 0, 00:16:34.427 "r_mbytes_per_sec": 0, 00:16:34.427 "w_mbytes_per_sec": 0 00:16:34.427 }, 
00:16:34.427 "claimed": true, 00:16:34.427 "claim_type": "exclusive_write", 00:16:34.427 "zoned": false, 00:16:34.427 "supported_io_types": { 00:16:34.427 "read": true, 00:16:34.427 "write": true, 00:16:34.427 "unmap": true, 00:16:34.427 "flush": true, 00:16:34.427 "reset": true, 00:16:34.427 "nvme_admin": false, 00:16:34.427 "nvme_io": false, 00:16:34.427 "nvme_io_md": false, 00:16:34.427 "write_zeroes": true, 00:16:34.427 "zcopy": true, 00:16:34.427 "get_zone_info": false, 00:16:34.427 "zone_management": false, 00:16:34.427 "zone_append": false, 00:16:34.427 "compare": false, 00:16:34.427 "compare_and_write": false, 00:16:34.427 "abort": true, 00:16:34.427 "seek_hole": false, 00:16:34.427 "seek_data": false, 00:16:34.427 "copy": true, 00:16:34.427 "nvme_iov_md": false 00:16:34.427 }, 00:16:34.427 "memory_domains": [ 00:16:34.427 { 00:16:34.427 "dma_device_id": "system", 00:16:34.427 "dma_device_type": 1 00:16:34.427 }, 00:16:34.427 { 00:16:34.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.427 "dma_device_type": 2 00:16:34.427 } 00:16:34.427 ], 00:16:34.427 "driver_specific": {} 00:16:34.427 }' 00:16:34.427 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.427 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.427 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:34.427 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.427 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:34.686 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:34.943 "name": "BaseBdev2", 00:16:34.943 "aliases": [ 00:16:34.943 "85538e42-0ee3-4fb6-b8d8-1be327770f20" 00:16:34.943 ], 00:16:34.943 "product_name": "Malloc disk", 00:16:34.943 "block_size": 512, 00:16:34.943 "num_blocks": 65536, 00:16:34.943 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:34.943 "assigned_rate_limits": { 00:16:34.943 "rw_ios_per_sec": 0, 00:16:34.943 "rw_mbytes_per_sec": 0, 00:16:34.943 "r_mbytes_per_sec": 0, 00:16:34.943 "w_mbytes_per_sec": 0 00:16:34.943 }, 00:16:34.943 "claimed": true, 00:16:34.943 "claim_type": "exclusive_write", 00:16:34.943 "zoned": false, 00:16:34.943 
"supported_io_types": { 00:16:34.943 "read": true, 00:16:34.943 "write": true, 00:16:34.943 "unmap": true, 00:16:34.943 "flush": true, 00:16:34.943 "reset": true, 00:16:34.943 "nvme_admin": false, 00:16:34.943 "nvme_io": false, 00:16:34.943 "nvme_io_md": false, 00:16:34.943 "write_zeroes": true, 00:16:34.943 "zcopy": true, 00:16:34.943 "get_zone_info": false, 00:16:34.943 "zone_management": false, 00:16:34.943 "zone_append": false, 00:16:34.943 "compare": false, 00:16:34.943 "compare_and_write": false, 00:16:34.943 "abort": true, 00:16:34.943 "seek_hole": false, 00:16:34.943 "seek_data": false, 00:16:34.943 "copy": true, 00:16:34.943 "nvme_iov_md": false 00:16:34.943 }, 00:16:34.943 "memory_domains": [ 00:16:34.943 { 00:16:34.943 "dma_device_id": "system", 00:16:34.943 "dma_device_type": 1 00:16:34.943 }, 00:16:34.943 { 00:16:34.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.943 "dma_device_type": 2 00:16:34.943 } 00:16:34.943 ], 00:16:34.943 "driver_specific": {} 00:16:34.943 }' 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.943 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:35.201 "name": "BaseBdev3", 00:16:35.201 "aliases": [ 00:16:35.201 "c88e0a12-75e0-4574-ab17-bf073d66e8c3" 00:16:35.201 ], 00:16:35.201 "product_name": "Malloc disk", 00:16:35.201 "block_size": 512, 00:16:35.201 "num_blocks": 65536, 00:16:35.201 "uuid": "c88e0a12-75e0-4574-ab17-bf073d66e8c3", 00:16:35.201 "assigned_rate_limits": { 00:16:35.201 "rw_ios_per_sec": 0, 00:16:35.201 "rw_mbytes_per_sec": 0, 00:16:35.201 "r_mbytes_per_sec": 0, 00:16:35.201 "w_mbytes_per_sec": 0 00:16:35.201 }, 00:16:35.201 "claimed": true, 00:16:35.201 "claim_type": "exclusive_write", 00:16:35.201 "zoned": false, 00:16:35.201 "supported_io_types": { 00:16:35.201 "read": true, 00:16:35.201 "write": true, 00:16:35.201 "unmap": true, 00:16:35.201 "flush": 
true, 00:16:35.201 "reset": true, 00:16:35.201 "nvme_admin": false, 00:16:35.201 "nvme_io": false, 00:16:35.201 "nvme_io_md": false, 00:16:35.201 "write_zeroes": true, 00:16:35.201 "zcopy": true, 00:16:35.201 "get_zone_info": false, 00:16:35.201 "zone_management": false, 00:16:35.201 "zone_append": false, 00:16:35.201 "compare": false, 00:16:35.201 "compare_and_write": false, 00:16:35.201 "abort": true, 00:16:35.201 "seek_hole": false, 00:16:35.201 "seek_data": false, 00:16:35.201 "copy": true, 00:16:35.201 "nvme_iov_md": false 00:16:35.201 }, 00:16:35.201 "memory_domains": [ 00:16:35.201 { 00:16:35.201 "dma_device_id": "system", 00:16:35.201 "dma_device_type": 1 00:16:35.201 }, 00:16:35.201 { 00:16:35.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.201 "dma_device_type": 2 00:16:35.201 } 00:16:35.201 ], 00:16:35.201 "driver_specific": {} 00:16:35.201 }' 00:16:35.201 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.458 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.458 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:35.458 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.715 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.715 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.715 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.715 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:35.715 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:35.715 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:35.715 "name": "BaseBdev4", 00:16:35.715 "aliases": [ 00:16:35.715 "39600f47-f107-432f-b1c0-713ff4b2ca44" 00:16:35.715 ], 00:16:35.715 "product_name": "Malloc disk", 00:16:35.715 "block_size": 512, 00:16:35.715 "num_blocks": 65536, 00:16:35.715 "uuid": "39600f47-f107-432f-b1c0-713ff4b2ca44", 00:16:35.715 "assigned_rate_limits": { 00:16:35.715 "rw_ios_per_sec": 0, 00:16:35.715 "rw_mbytes_per_sec": 0, 00:16:35.715 "r_mbytes_per_sec": 0, 00:16:35.715 "w_mbytes_per_sec": 0 00:16:35.715 }, 00:16:35.715 "claimed": true, 00:16:35.715 "claim_type": "exclusive_write", 00:16:35.715 "zoned": false, 00:16:35.715 "supported_io_types": { 00:16:35.715 "read": true, 00:16:35.715 "write": true, 00:16:35.715 "unmap": true, 00:16:35.715 "flush": true, 00:16:35.715 "reset": true, 00:16:35.715 "nvme_admin": false, 00:16:35.715 "nvme_io": false, 00:16:35.715 "nvme_io_md": false, 
00:16:35.715 "write_zeroes": true, 00:16:35.715 "zcopy": true, 00:16:35.715 "get_zone_info": false, 00:16:35.715 "zone_management": false, 00:16:35.715 "zone_append": false, 00:16:35.715 "compare": false, 00:16:35.715 "compare_and_write": false, 00:16:35.715 "abort": true, 00:16:35.715 "seek_hole": false, 00:16:35.715 "seek_data": false, 00:16:35.715 "copy": true, 00:16:35.715 "nvme_iov_md": false 00:16:35.715 }, 00:16:35.716 "memory_domains": [ 00:16:35.716 { 00:16:35.716 "dma_device_id": "system", 00:16:35.716 "dma_device_type": 1 00:16:35.716 }, 00:16:35.716 { 00:16:35.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.716 "dma_device_type": 2 00:16:35.716 } 00:16:35.716 ], 00:16:35.716 "driver_specific": {} 00:16:35.716 }' 00:16:35.716 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.973 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:36.231 [2024-07-15 13:39:23.752084] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:36.231 [2024-07-15 13:39:23.752105] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:36.231 [2024-07-15 13:39:23.752137] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:16:36.231 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:36.232 13:39:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.232 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.489 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.489 "name": "Existed_Raid", 00:16:36.489 "uuid": "da80a348-b38e-4291-a083-d90b69cbf5b8", 00:16:36.489 "strip_size_kb": 64, 00:16:36.489 "state": "offline", 00:16:36.489 "raid_level": "concat", 00:16:36.489 "superblock": true, 00:16:36.489 "num_base_bdevs": 4, 00:16:36.489 "num_base_bdevs_discovered": 3, 00:16:36.489 "num_base_bdevs_operational": 3, 00:16:36.489 "base_bdevs_list": [ 00:16:36.489 { 00:16:36.489 "name": null, 00:16:36.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.489 "is_configured": false, 00:16:36.489 "data_offset": 2048, 00:16:36.489 "data_size": 63488 00:16:36.489 }, 00:16:36.489 { 00:16:36.489 "name": "BaseBdev2", 00:16:36.489 "uuid": "85538e42-0ee3-4fb6-b8d8-1be327770f20", 00:16:36.489 "is_configured": true, 00:16:36.489 "data_offset": 2048, 00:16:36.489 "data_size": 63488 00:16:36.489 }, 00:16:36.489 { 00:16:36.489 "name": "BaseBdev3", 00:16:36.489 "uuid": "c88e0a12-75e0-4574-ab17-bf073d66e8c3", 00:16:36.489 "is_configured": true, 00:16:36.489 "data_offset": 2048, 00:16:36.489 "data_size": 63488 00:16:36.489 }, 00:16:36.489 { 00:16:36.489 "name": "BaseBdev4", 00:16:36.489 "uuid": "39600f47-f107-432f-b1c0-713ff4b2ca44", 00:16:36.489 "is_configured": true, 00:16:36.489 "data_offset": 2048, 00:16:36.489 "data_size": 63488 00:16:36.489 } 00:16:36.489 ] 00:16:36.489 }' 00:16:36.489 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.489 13:39:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.054 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:37.054 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:37.054 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.054 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:37.054 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:37.054 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:37.054 13:39:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:37.311 [2024-07-15 13:39:24.791581] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:37.311 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:37.311 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:37.311 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.311 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:37.568 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:37.568 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:37.568 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:37.568 [2024-07-15 13:39:25.160417] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:37.825 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:38.082 [2024-07-15 13:39:25.509159] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:38.082 [2024-07-15 13:39:25.509192] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1400840 name Existed_Raid, state offline 00:16:38.082 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:38.082 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.082 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.082 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:38.340 BaseBdev2 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:38.340 13:39:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.598 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:38.598 [ 00:16:38.598 { 00:16:38.598 "name": "BaseBdev2", 00:16:38.598 "aliases": [ 00:16:38.598 "58687cbe-6b38-47d7-af2f-68ac22c3a8c0" 00:16:38.598 ], 00:16:38.598 "product_name": "Malloc disk", 00:16:38.598 "block_size": 512, 00:16:38.598 "num_blocks": 65536, 00:16:38.598 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:38.598 "assigned_rate_limits": { 00:16:38.598 "rw_ios_per_sec": 0, 00:16:38.598 "rw_mbytes_per_sec": 0, 00:16:38.598 "r_mbytes_per_sec": 0, 00:16:38.598 "w_mbytes_per_sec": 0 00:16:38.598 }, 00:16:38.598 "claimed": false, 00:16:38.598 "zoned": false, 00:16:38.598 "supported_io_types": { 00:16:38.598 "read": true, 00:16:38.598 "write": true, 00:16:38.598 "unmap": true, 00:16:38.598 "flush": true, 00:16:38.598 "reset": true, 00:16:38.598 "nvme_admin": false, 00:16:38.598 "nvme_io": false, 00:16:38.598 "nvme_io_md": false, 00:16:38.598 "write_zeroes": true, 00:16:38.598 "zcopy": true, 00:16:38.598 "get_zone_info": false, 00:16:38.598 "zone_management": false, 00:16:38.598 "zone_append": false, 00:16:38.598 "compare": false, 00:16:38.598 "compare_and_write": false, 00:16:38.598 "abort": true, 00:16:38.598 "seek_hole": false, 00:16:38.598 "seek_data": false, 00:16:38.598 "copy": true, 00:16:38.599 "nvme_iov_md": false 00:16:38.599 }, 00:16:38.599 "memory_domains": [ 00:16:38.599 { 00:16:38.599 "dma_device_id": "system", 00:16:38.599 "dma_device_type": 1 00:16:38.599 }, 00:16:38.599 { 00:16:38.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.599 "dma_device_type": 2 00:16:38.599 } 00:16:38.599 ], 00:16:38.599 "driver_specific": {} 00:16:38.599 } 00:16:38.599 ] 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:38.856 BaseBdev3 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:38.856 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:39.113 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:39.371 [ 00:16:39.371 { 00:16:39.371 "name": "BaseBdev3", 00:16:39.371 "aliases": [ 00:16:39.371 "0ae9b374-6ca9-475b-8a8d-c165e57844a0" 00:16:39.371 ], 00:16:39.371 "product_name": "Malloc disk", 00:16:39.371 "block_size": 512, 00:16:39.371 "num_blocks": 65536, 00:16:39.371 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:39.371 "assigned_rate_limits": { 00:16:39.371 "rw_ios_per_sec": 0, 00:16:39.371 "rw_mbytes_per_sec": 0, 00:16:39.371 "r_mbytes_per_sec": 0, 00:16:39.371 "w_mbytes_per_sec": 0 00:16:39.371 }, 00:16:39.371 "claimed": false, 00:16:39.371 "zoned": false, 00:16:39.371 "supported_io_types": { 00:16:39.371 "read": true, 00:16:39.371 "write": true, 00:16:39.371 "unmap": true, 00:16:39.371 "flush": true, 00:16:39.371 "reset": true, 00:16:39.371 "nvme_admin": false, 00:16:39.371 "nvme_io": false, 00:16:39.371 "nvme_io_md": false, 00:16:39.371 "write_zeroes": true, 00:16:39.371 "zcopy": true, 00:16:39.371 "get_zone_info": false, 00:16:39.371 "zone_management": false, 00:16:39.371 "zone_append": false, 00:16:39.371 "compare": false, 00:16:39.371 "compare_and_write": false, 00:16:39.371 "abort": true, 00:16:39.371 "seek_hole": false, 00:16:39.371 "seek_data": false, 00:16:39.371 "copy": true, 00:16:39.371 "nvme_iov_md": false 00:16:39.371 }, 00:16:39.371 "memory_domains": [ 00:16:39.371 { 00:16:39.371 "dma_device_id": "system", 00:16:39.371 "dma_device_type": 1 00:16:39.371 }, 00:16:39.371 { 00:16:39.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.371 "dma_device_type": 2 00:16:39.371 } 00:16:39.371 ], 00:16:39.371 "driver_specific": {} 00:16:39.371 } 00:16:39.371 ] 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:39.371 BaseBdev4 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:39.371 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:39.372 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:39.372 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:39.372 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:39.629 13:39:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:39.899 [ 00:16:39.899 { 00:16:39.899 "name": "BaseBdev4", 00:16:39.899 "aliases": [ 00:16:39.899 "3655f609-f91b-4d9e-92fe-3c743aa54701" 00:16:39.899 ], 00:16:39.899 "product_name": "Malloc disk", 00:16:39.899 "block_size": 512, 00:16:39.899 "num_blocks": 65536, 00:16:39.899 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:39.899 "assigned_rate_limits": { 00:16:39.899 "rw_ios_per_sec": 0, 00:16:39.899 "rw_mbytes_per_sec": 0, 00:16:39.899 "r_mbytes_per_sec": 0, 00:16:39.899 "w_mbytes_per_sec": 0 00:16:39.899 }, 00:16:39.899 "claimed": false, 00:16:39.899 "zoned": false, 00:16:39.899 "supported_io_types": { 00:16:39.899 "read": true, 00:16:39.899 "write": true, 00:16:39.899 "unmap": true, 00:16:39.899 "flush": true, 00:16:39.899 "reset": true, 00:16:39.899 "nvme_admin": false, 00:16:39.899 "nvme_io": false, 00:16:39.899 "nvme_io_md": false, 00:16:39.899 "write_zeroes": true, 00:16:39.899 "zcopy": true, 00:16:39.899 "get_zone_info": false, 00:16:39.899 "zone_management": false, 00:16:39.899 "zone_append": false, 00:16:39.899 "compare": false, 00:16:39.899 "compare_and_write": false, 00:16:39.899 "abort": true, 00:16:39.899 "seek_hole": false, 00:16:39.899 "seek_data": false, 00:16:39.899 "copy": true, 00:16:39.899 "nvme_iov_md": false 00:16:39.899 }, 00:16:39.899 "memory_domains": [ 00:16:39.899 { 00:16:39.899 "dma_device_id": "system", 00:16:39.899 "dma_device_type": 1 00:16:39.899 }, 00:16:39.899 { 00:16:39.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.899 "dma_device_type": 2 00:16:39.899 } 00:16:39.899 ], 00:16:39.899 "driver_specific": {} 00:16:39.899 } 00:16:39.899 ] 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:39.899 [2024-07-15 13:39:27.406277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:39.899 [2024-07-15 13:39:27.406308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:39.899 [2024-07-15 13:39:27.406321] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:39.899 [2024-07-15 13:39:27.407290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:39.899 [2024-07-15 13:39:27.407319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.899 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.225 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.225 "name": "Existed_Raid", 00:16:40.225 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:40.225 "strip_size_kb": 64, 00:16:40.225 "state": "configuring", 00:16:40.225 "raid_level": "concat", 00:16:40.225 "superblock": true, 00:16:40.225 "num_base_bdevs": 4, 00:16:40.225 "num_base_bdevs_discovered": 3, 00:16:40.225 "num_base_bdevs_operational": 4, 00:16:40.225 "base_bdevs_list": [ 00:16:40.225 { 00:16:40.225 "name": "BaseBdev1", 00:16:40.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.225 "is_configured": false, 00:16:40.225 "data_offset": 0, 00:16:40.225 "data_size": 0 00:16:40.225 }, 00:16:40.225 { 00:16:40.225 "name": "BaseBdev2", 00:16:40.225 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:40.225 "is_configured": true, 00:16:40.225 "data_offset": 2048, 00:16:40.225 "data_size": 63488 00:16:40.225 }, 00:16:40.225 { 00:16:40.225 "name": "BaseBdev3", 00:16:40.225 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:40.225 "is_configured": true, 00:16:40.225 "data_offset": 2048, 00:16:40.225 "data_size": 63488 00:16:40.225 }, 00:16:40.225 { 00:16:40.225 "name": "BaseBdev4", 00:16:40.225 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:40.225 "is_configured": true, 00:16:40.225 "data_offset": 2048, 00:16:40.225 "data_size": 63488 00:16:40.225 } 00:16:40.225 ] 00:16:40.225 }' 00:16:40.225 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.225 13:39:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.791 13:39:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:40.791 [2024-07-15 13:39:28.268645] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.791 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.050 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.050 "name": "Existed_Raid", 00:16:41.050 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:41.050 "strip_size_kb": 64, 00:16:41.050 "state": "configuring", 00:16:41.050 "raid_level": "concat", 00:16:41.050 "superblock": true, 00:16:41.050 "num_base_bdevs": 4, 00:16:41.050 "num_base_bdevs_discovered": 2, 00:16:41.050 "num_base_bdevs_operational": 4, 00:16:41.050 "base_bdevs_list": [ 00:16:41.050 { 00:16:41.050 "name": "BaseBdev1", 00:16:41.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.050 "is_configured": false, 00:16:41.050 "data_offset": 0, 00:16:41.050 "data_size": 0 00:16:41.050 }, 00:16:41.050 { 00:16:41.050 "name": null, 00:16:41.050 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:41.050 "is_configured": false, 00:16:41.050 "data_offset": 2048, 00:16:41.050 "data_size": 63488 00:16:41.050 }, 00:16:41.050 { 00:16:41.050 "name": "BaseBdev3", 00:16:41.050 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:41.050 "is_configured": true, 00:16:41.050 "data_offset": 2048, 00:16:41.050 "data_size": 63488 00:16:41.050 }, 00:16:41.050 { 00:16:41.050 "name": "BaseBdev4", 00:16:41.050 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:41.050 "is_configured": true, 00:16:41.050 "data_offset": 2048, 00:16:41.050 "data_size": 63488 00:16:41.050 } 00:16:41.050 ] 00:16:41.050 }' 00:16:41.050 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.050 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:41.615 13:39:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:41.615 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.615 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:41.615 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:41.873 [2024-07-15 13:39:29.303409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:41.873 BaseBdev1 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:41.873 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:42.132 [ 00:16:42.132 { 00:16:42.132 "name": "BaseBdev1", 00:16:42.132 "aliases": [ 00:16:42.132 "e37c4776-36c1-49a3-b04b-82d87324cd8a" 00:16:42.132 ], 00:16:42.132 "product_name": "Malloc disk", 00:16:42.132 "block_size": 512, 00:16:42.132 "num_blocks": 65536, 00:16:42.132 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:42.132 "assigned_rate_limits": { 00:16:42.132 "rw_ios_per_sec": 0, 00:16:42.132 "rw_mbytes_per_sec": 0, 00:16:42.132 "r_mbytes_per_sec": 0, 00:16:42.132 "w_mbytes_per_sec": 0 00:16:42.132 }, 00:16:42.132 "claimed": true, 00:16:42.132 "claim_type": "exclusive_write", 00:16:42.132 "zoned": false, 00:16:42.132 "supported_io_types": { 00:16:42.132 "read": true, 00:16:42.132 "write": true, 00:16:42.132 "unmap": true, 00:16:42.132 "flush": true, 00:16:42.132 "reset": true, 00:16:42.132 "nvme_admin": false, 00:16:42.132 "nvme_io": false, 00:16:42.132 "nvme_io_md": false, 00:16:42.132 "write_zeroes": true, 00:16:42.132 "zcopy": true, 00:16:42.132 "get_zone_info": false, 00:16:42.132 "zone_management": false, 00:16:42.132 "zone_append": false, 00:16:42.132 "compare": false, 00:16:42.132 "compare_and_write": false, 00:16:42.132 "abort": true, 00:16:42.132 "seek_hole": false, 00:16:42.132 "seek_data": false, 00:16:42.132 "copy": true, 00:16:42.132 "nvme_iov_md": false 00:16:42.132 }, 00:16:42.132 "memory_domains": [ 00:16:42.132 { 00:16:42.132 "dma_device_id": "system", 00:16:42.132 "dma_device_type": 1 00:16:42.132 }, 00:16:42.132 { 00:16:42.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.132 "dma_device_type": 2 00:16:42.132 } 00:16:42.132 ], 00:16:42.132 "driver_specific": {} 00:16:42.132 } 00:16:42.132 ] 00:16:42.132 
13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.132 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.391 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.391 "name": "Existed_Raid", 00:16:42.391 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:42.391 "strip_size_kb": 64, 00:16:42.391 "state": "configuring", 00:16:42.391 "raid_level": "concat", 00:16:42.391 "superblock": true, 00:16:42.391 "num_base_bdevs": 4, 00:16:42.391 "num_base_bdevs_discovered": 3, 00:16:42.391 "num_base_bdevs_operational": 4, 00:16:42.391 "base_bdevs_list": [ 00:16:42.391 { 00:16:42.391 "name": "BaseBdev1", 00:16:42.391 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:42.391 "is_configured": true, 00:16:42.391 "data_offset": 2048, 00:16:42.391 "data_size": 63488 00:16:42.391 }, 00:16:42.391 { 00:16:42.391 "name": null, 00:16:42.391 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:42.391 "is_configured": false, 00:16:42.391 "data_offset": 2048, 00:16:42.391 "data_size": 63488 00:16:42.391 }, 00:16:42.391 { 00:16:42.391 "name": "BaseBdev3", 00:16:42.391 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:42.391 "is_configured": true, 00:16:42.391 "data_offset": 2048, 00:16:42.391 "data_size": 63488 00:16:42.391 }, 00:16:42.391 { 00:16:42.391 "name": "BaseBdev4", 00:16:42.391 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:42.391 "is_configured": true, 00:16:42.391 "data_offset": 2048, 00:16:42.391 "data_size": 63488 00:16:42.391 } 00:16:42.391 ] 00:16:42.391 }' 00:16:42.391 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.391 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.957 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.957 13:39:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:42.957 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:42.957 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:43.215 [2024-07-15 13:39:30.691016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.215 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.473 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.473 "name": "Existed_Raid", 00:16:43.473 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:43.473 "strip_size_kb": 64, 00:16:43.473 "state": "configuring", 00:16:43.473 "raid_level": "concat", 00:16:43.473 "superblock": true, 00:16:43.473 "num_base_bdevs": 4, 00:16:43.473 "num_base_bdevs_discovered": 2, 00:16:43.473 "num_base_bdevs_operational": 4, 00:16:43.473 "base_bdevs_list": [ 00:16:43.473 { 00:16:43.473 "name": "BaseBdev1", 00:16:43.473 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:43.473 "is_configured": true, 00:16:43.473 "data_offset": 2048, 00:16:43.473 "data_size": 63488 00:16:43.473 }, 00:16:43.473 { 00:16:43.473 "name": null, 00:16:43.473 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:43.473 "is_configured": false, 00:16:43.473 "data_offset": 2048, 00:16:43.473 "data_size": 63488 00:16:43.473 }, 00:16:43.473 { 00:16:43.473 "name": null, 00:16:43.473 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:43.473 "is_configured": false, 00:16:43.473 "data_offset": 2048, 00:16:43.473 "data_size": 63488 00:16:43.473 }, 00:16:43.473 { 00:16:43.473 "name": "BaseBdev4", 00:16:43.473 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:43.473 "is_configured": true, 00:16:43.473 "data_offset": 2048, 00:16:43.473 "data_size": 63488 00:16:43.473 } 00:16:43.473 ] 00:16:43.473 }' 00:16:43.473 13:39:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.473 13:39:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.039 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:44.039 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.039 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:44.039 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:44.297 [2024-07-15 13:39:31.721686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.297 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.554 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.555 "name": "Existed_Raid", 00:16:44.555 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:44.555 "strip_size_kb": 64, 00:16:44.555 "state": "configuring", 00:16:44.555 "raid_level": "concat", 00:16:44.555 "superblock": true, 00:16:44.555 "num_base_bdevs": 4, 00:16:44.555 "num_base_bdevs_discovered": 3, 00:16:44.555 "num_base_bdevs_operational": 4, 00:16:44.555 "base_bdevs_list": [ 00:16:44.555 { 00:16:44.555 "name": "BaseBdev1", 00:16:44.555 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:44.555 "is_configured": true, 00:16:44.555 "data_offset": 2048, 00:16:44.555 "data_size": 63488 00:16:44.555 }, 00:16:44.555 { 00:16:44.555 "name": null, 00:16:44.555 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:44.555 "is_configured": false, 00:16:44.555 "data_offset": 2048, 00:16:44.555 "data_size": 63488 00:16:44.555 }, 00:16:44.555 { 00:16:44.555 "name": "BaseBdev3", 00:16:44.555 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 
00:16:44.555 "is_configured": true, 00:16:44.555 "data_offset": 2048, 00:16:44.555 "data_size": 63488 00:16:44.555 }, 00:16:44.555 { 00:16:44.555 "name": "BaseBdev4", 00:16:44.555 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:44.555 "is_configured": true, 00:16:44.555 "data_offset": 2048, 00:16:44.555 "data_size": 63488 00:16:44.555 } 00:16:44.555 ] 00:16:44.555 }' 00:16:44.555 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.555 13:39:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.135 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:45.135 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.135 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:45.135 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:45.397 [2024-07-15 13:39:32.768389] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.397 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.397 "name": "Existed_Raid", 00:16:45.397 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:45.397 "strip_size_kb": 64, 00:16:45.397 "state": "configuring", 00:16:45.397 "raid_level": "concat", 00:16:45.397 "superblock": true, 00:16:45.397 "num_base_bdevs": 4, 00:16:45.398 "num_base_bdevs_discovered": 2, 00:16:45.398 "num_base_bdevs_operational": 4, 00:16:45.398 "base_bdevs_list": [ 00:16:45.398 { 00:16:45.398 "name": null, 00:16:45.398 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:45.398 "is_configured": false, 00:16:45.398 "data_offset": 
2048, 00:16:45.398 "data_size": 63488 00:16:45.398 }, 00:16:45.398 { 00:16:45.398 "name": null, 00:16:45.398 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:45.398 "is_configured": false, 00:16:45.398 "data_offset": 2048, 00:16:45.398 "data_size": 63488 00:16:45.398 }, 00:16:45.398 { 00:16:45.398 "name": "BaseBdev3", 00:16:45.398 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:45.398 "is_configured": true, 00:16:45.398 "data_offset": 2048, 00:16:45.398 "data_size": 63488 00:16:45.398 }, 00:16:45.398 { 00:16:45.398 "name": "BaseBdev4", 00:16:45.398 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:45.398 "is_configured": true, 00:16:45.398 "data_offset": 2048, 00:16:45.398 "data_size": 63488 00:16:45.398 } 00:16:45.398 ] 00:16:45.398 }' 00:16:45.398 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.398 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.961 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.961 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:46.217 [2024-07-15 13:39:33.795092] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.217 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.218 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.218 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.218 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.218 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.475 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.475 "name": "Existed_Raid", 00:16:46.475 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:46.475 "strip_size_kb": 64, 
00:16:46.475 "state": "configuring", 00:16:46.475 "raid_level": "concat", 00:16:46.475 "superblock": true, 00:16:46.475 "num_base_bdevs": 4, 00:16:46.475 "num_base_bdevs_discovered": 3, 00:16:46.475 "num_base_bdevs_operational": 4, 00:16:46.475 "base_bdevs_list": [ 00:16:46.475 { 00:16:46.475 "name": null, 00:16:46.475 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:46.475 "is_configured": false, 00:16:46.475 "data_offset": 2048, 00:16:46.475 "data_size": 63488 00:16:46.475 }, 00:16:46.475 { 00:16:46.475 "name": "BaseBdev2", 00:16:46.475 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:46.475 "is_configured": true, 00:16:46.475 "data_offset": 2048, 00:16:46.475 "data_size": 63488 00:16:46.475 }, 00:16:46.475 { 00:16:46.475 "name": "BaseBdev3", 00:16:46.475 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:46.475 "is_configured": true, 00:16:46.475 "data_offset": 2048, 00:16:46.475 "data_size": 63488 00:16:46.475 }, 00:16:46.475 { 00:16:46.475 "name": "BaseBdev4", 00:16:46.475 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:46.475 "is_configured": true, 00:16:46.475 "data_offset": 2048, 00:16:46.475 "data_size": 63488 00:16:46.475 } 00:16:46.475 ] 00:16:46.475 }' 00:16:46.475 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.475 13:39:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.041 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.041 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:47.300 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:47.300 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.300 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:47.300 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e37c4776-36c1-49a3-b04b-82d87324cd8a 00:16:47.559 [2024-07-15 13:39:35.005614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:47.559 [2024-07-15 13:39:35.005736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1402900 00:16:47.559 [2024-07-15 13:39:35.005745] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:47.559 [2024-07-15 13:39:35.005861] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f84b0 00:16:47.559 [2024-07-15 13:39:35.005951] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1402900 00:16:47.559 [2024-07-15 13:39:35.005957] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1402900 00:16:47.559 [2024-07-15 13:39:35.006041] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:47.559 NewBaseBdev 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:47.559 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:47.818 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:47.818 [ 00:16:47.818 { 00:16:47.818 "name": "NewBaseBdev", 00:16:47.818 "aliases": [ 00:16:47.818 "e37c4776-36c1-49a3-b04b-82d87324cd8a" 00:16:47.818 ], 00:16:47.818 "product_name": "Malloc disk", 00:16:47.818 "block_size": 512, 00:16:47.818 "num_blocks": 65536, 00:16:47.818 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:47.819 "assigned_rate_limits": { 00:16:47.819 "rw_ios_per_sec": 0, 00:16:47.819 "rw_mbytes_per_sec": 0, 00:16:47.819 "r_mbytes_per_sec": 0, 00:16:47.819 "w_mbytes_per_sec": 0 00:16:47.819 }, 00:16:47.819 "claimed": true, 00:16:47.819 "claim_type": "exclusive_write", 00:16:47.819 "zoned": false, 00:16:47.819 "supported_io_types": { 00:16:47.819 "read": true, 00:16:47.819 "write": true, 00:16:47.819 "unmap": true, 00:16:47.819 "flush": true, 00:16:47.819 "reset": true, 00:16:47.819 "nvme_admin": false, 00:16:47.819 "nvme_io": false, 00:16:47.819 "nvme_io_md": false, 00:16:47.819 "write_zeroes": true, 00:16:47.819 "zcopy": true, 00:16:47.819 "get_zone_info": false, 00:16:47.819 "zone_management": false, 00:16:47.819 "zone_append": false, 00:16:47.819 "compare": false, 00:16:47.819 "compare_and_write": false, 00:16:47.819 "abort": true, 00:16:47.819 "seek_hole": false, 00:16:47.819 "seek_data": false, 00:16:47.819 "copy": true, 00:16:47.819 "nvme_iov_md": false 00:16:47.819 }, 00:16:47.819 "memory_domains": [ 00:16:47.819 { 00:16:47.819 "dma_device_id": "system", 00:16:47.819 "dma_device_type": 1 00:16:47.819 }, 00:16:47.819 { 00:16:47.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.819 "dma_device_type": 2 00:16:47.819 } 00:16:47.819 ], 00:16:47.819 "driver_specific": {} 00:16:47.819 } 00:16:47.819 ] 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.819 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.077 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.077 "name": "Existed_Raid", 00:16:48.077 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:48.077 "strip_size_kb": 64, 00:16:48.077 "state": "online", 00:16:48.077 "raid_level": "concat", 00:16:48.077 "superblock": true, 00:16:48.077 "num_base_bdevs": 4, 00:16:48.077 "num_base_bdevs_discovered": 4, 00:16:48.077 "num_base_bdevs_operational": 4, 00:16:48.077 "base_bdevs_list": [ 00:16:48.077 { 00:16:48.077 "name": "NewBaseBdev", 00:16:48.077 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:48.077 "is_configured": true, 00:16:48.077 "data_offset": 2048, 00:16:48.077 "data_size": 63488 00:16:48.077 }, 00:16:48.077 { 00:16:48.077 "name": "BaseBdev2", 00:16:48.077 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:48.077 "is_configured": true, 00:16:48.077 "data_offset": 2048, 00:16:48.077 "data_size": 63488 00:16:48.077 }, 00:16:48.077 { 00:16:48.077 "name": "BaseBdev3", 00:16:48.077 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:48.077 "is_configured": true, 00:16:48.077 "data_offset": 2048, 00:16:48.077 "data_size": 63488 00:16:48.077 }, 00:16:48.077 { 00:16:48.077 "name": "BaseBdev4", 00:16:48.077 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:48.077 "is_configured": true, 00:16:48.077 "data_offset": 2048, 00:16:48.077 "data_size": 63488 00:16:48.077 } 00:16:48.077 ] 00:16:48.077 }' 00:16:48.077 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.077 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:48.643 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:48.643 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:48.643 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:48.643 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:48.643 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:48.643 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:48.643 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:48.643 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:48.643 [2024-07-15 13:39:36.156814] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:48.643 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:16:48.643 "name": "Existed_Raid", 00:16:48.643 "aliases": [ 00:16:48.643 "15d77e50-a0d8-457a-998c-42f08ba5f8fe" 00:16:48.643 ], 00:16:48.643 "product_name": "Raid Volume", 00:16:48.643 "block_size": 512, 00:16:48.643 "num_blocks": 253952, 00:16:48.643 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:48.643 "assigned_rate_limits": { 00:16:48.643 "rw_ios_per_sec": 0, 00:16:48.643 "rw_mbytes_per_sec": 0, 00:16:48.643 "r_mbytes_per_sec": 0, 00:16:48.643 "w_mbytes_per_sec": 0 00:16:48.643 }, 00:16:48.643 "claimed": false, 00:16:48.643 "zoned": false, 00:16:48.643 "supported_io_types": { 00:16:48.643 "read": true, 00:16:48.643 "write": true, 00:16:48.643 "unmap": true, 00:16:48.643 "flush": true, 00:16:48.643 "reset": true, 00:16:48.643 "nvme_admin": false, 00:16:48.643 "nvme_io": false, 00:16:48.643 "nvme_io_md": false, 00:16:48.643 "write_zeroes": true, 00:16:48.643 "zcopy": false, 00:16:48.643 "get_zone_info": false, 00:16:48.643 "zone_management": false, 00:16:48.643 "zone_append": false, 00:16:48.643 "compare": false, 00:16:48.643 "compare_and_write": false, 00:16:48.643 "abort": false, 00:16:48.643 "seek_hole": false, 00:16:48.643 "seek_data": false, 00:16:48.643 "copy": false, 00:16:48.643 "nvme_iov_md": false 00:16:48.643 }, 00:16:48.643 "memory_domains": [ 00:16:48.643 { 00:16:48.643 "dma_device_id": "system", 00:16:48.643 "dma_device_type": 1 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.643 "dma_device_type": 2 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "system", 00:16:48.643 "dma_device_type": 1 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.643 "dma_device_type": 2 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "system", 00:16:48.643 "dma_device_type": 1 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.643 "dma_device_type": 2 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "system", 00:16:48.643 "dma_device_type": 1 00:16:48.643 }, 00:16:48.643 { 00:16:48.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.643 "dma_device_type": 2 00:16:48.643 } 00:16:48.643 ], 00:16:48.643 "driver_specific": { 00:16:48.643 "raid": { 00:16:48.643 "uuid": "15d77e50-a0d8-457a-998c-42f08ba5f8fe", 00:16:48.644 "strip_size_kb": 64, 00:16:48.644 "state": "online", 00:16:48.644 "raid_level": "concat", 00:16:48.644 "superblock": true, 00:16:48.644 "num_base_bdevs": 4, 00:16:48.644 "num_base_bdevs_discovered": 4, 00:16:48.644 "num_base_bdevs_operational": 4, 00:16:48.644 "base_bdevs_list": [ 00:16:48.644 { 00:16:48.644 "name": "NewBaseBdev", 00:16:48.644 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:48.644 "is_configured": true, 00:16:48.644 "data_offset": 2048, 00:16:48.644 "data_size": 63488 00:16:48.644 }, 00:16:48.644 { 00:16:48.644 "name": "BaseBdev2", 00:16:48.644 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:48.644 "is_configured": true, 00:16:48.644 "data_offset": 2048, 00:16:48.644 "data_size": 63488 00:16:48.644 }, 00:16:48.644 { 00:16:48.644 "name": "BaseBdev3", 00:16:48.644 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:48.644 "is_configured": true, 00:16:48.644 "data_offset": 2048, 00:16:48.644 "data_size": 63488 00:16:48.644 }, 00:16:48.644 { 00:16:48.644 "name": "BaseBdev4", 00:16:48.644 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:48.644 "is_configured": true, 00:16:48.644 "data_offset": 2048, 00:16:48.644 "data_size": 63488 00:16:48.644 } 
00:16:48.644 ] 00:16:48.644 } 00:16:48.644 } 00:16:48.644 }' 00:16:48.644 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:48.644 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:48.644 BaseBdev2 00:16:48.644 BaseBdev3 00:16:48.644 BaseBdev4' 00:16:48.644 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.644 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:48.644 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.901 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.901 "name": "NewBaseBdev", 00:16:48.901 "aliases": [ 00:16:48.901 "e37c4776-36c1-49a3-b04b-82d87324cd8a" 00:16:48.901 ], 00:16:48.901 "product_name": "Malloc disk", 00:16:48.901 "block_size": 512, 00:16:48.901 "num_blocks": 65536, 00:16:48.901 "uuid": "e37c4776-36c1-49a3-b04b-82d87324cd8a", 00:16:48.901 "assigned_rate_limits": { 00:16:48.901 "rw_ios_per_sec": 0, 00:16:48.901 "rw_mbytes_per_sec": 0, 00:16:48.901 "r_mbytes_per_sec": 0, 00:16:48.901 "w_mbytes_per_sec": 0 00:16:48.901 }, 00:16:48.901 "claimed": true, 00:16:48.901 "claim_type": "exclusive_write", 00:16:48.901 "zoned": false, 00:16:48.901 "supported_io_types": { 00:16:48.901 "read": true, 00:16:48.901 "write": true, 00:16:48.901 "unmap": true, 00:16:48.901 "flush": true, 00:16:48.901 "reset": true, 00:16:48.901 "nvme_admin": false, 00:16:48.901 "nvme_io": false, 00:16:48.901 "nvme_io_md": false, 00:16:48.901 "write_zeroes": true, 00:16:48.901 "zcopy": true, 00:16:48.901 "get_zone_info": false, 00:16:48.901 "zone_management": false, 00:16:48.901 "zone_append": false, 00:16:48.901 "compare": false, 00:16:48.901 "compare_and_write": false, 00:16:48.901 "abort": true, 00:16:48.901 "seek_hole": false, 00:16:48.901 "seek_data": false, 00:16:48.901 "copy": true, 00:16:48.901 "nvme_iov_md": false 00:16:48.901 }, 00:16:48.901 "memory_domains": [ 00:16:48.901 { 00:16:48.901 "dma_device_id": "system", 00:16:48.901 "dma_device_type": 1 00:16:48.901 }, 00:16:48.901 { 00:16:48.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.901 "dma_device_type": 2 00:16:48.901 } 00:16:48.901 ], 00:16:48.901 "driver_specific": {} 00:16:48.901 }' 00:16:48.901 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.901 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.901 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.902 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.159 
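The verify_raid_bdev_properties pass running here repeats the same few queries for every configured leg of the array: list the configured base bdev names out of the Raid Volume dump, then fetch each base bdev and compare its block size and metadata/DIF settings. A sketch of those queries, with the bdev name, 512-byte block size and jq filters as captured above; the $rpc shorthand is introduced only for the sketch:

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # The raid volume itself, as a single JSON object.
  raid_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')

  # Names of every configured base bdev in the array.
  names=$(echo "$raid_info" \
      | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

  # Per-leg checks: block size matches the array, no separate metadata, no DIF.
  for name in $names; do
      info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
      echo "$info" | jq .block_size      # 512 in this run
      echo "$info" | jq .md_size         # null
      echo "$info" | jq .md_interleave   # null
      echo "$info" | jq .dif_type        # null
  done
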
13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:49.159 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:49.417 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:49.417 "name": "BaseBdev2", 00:16:49.417 "aliases": [ 00:16:49.417 "58687cbe-6b38-47d7-af2f-68ac22c3a8c0" 00:16:49.417 ], 00:16:49.417 "product_name": "Malloc disk", 00:16:49.417 "block_size": 512, 00:16:49.417 "num_blocks": 65536, 00:16:49.417 "uuid": "58687cbe-6b38-47d7-af2f-68ac22c3a8c0", 00:16:49.417 "assigned_rate_limits": { 00:16:49.417 "rw_ios_per_sec": 0, 00:16:49.417 "rw_mbytes_per_sec": 0, 00:16:49.417 "r_mbytes_per_sec": 0, 00:16:49.417 "w_mbytes_per_sec": 0 00:16:49.417 }, 00:16:49.417 "claimed": true, 00:16:49.417 "claim_type": "exclusive_write", 00:16:49.417 "zoned": false, 00:16:49.417 "supported_io_types": { 00:16:49.417 "read": true, 00:16:49.417 "write": true, 00:16:49.417 "unmap": true, 00:16:49.417 "flush": true, 00:16:49.417 "reset": true, 00:16:49.417 "nvme_admin": false, 00:16:49.417 "nvme_io": false, 00:16:49.417 "nvme_io_md": false, 00:16:49.417 "write_zeroes": true, 00:16:49.417 "zcopy": true, 00:16:49.417 "get_zone_info": false, 00:16:49.417 "zone_management": false, 00:16:49.417 "zone_append": false, 00:16:49.417 "compare": false, 00:16:49.417 "compare_and_write": false, 00:16:49.417 "abort": true, 00:16:49.417 "seek_hole": false, 00:16:49.417 "seek_data": false, 00:16:49.417 "copy": true, 00:16:49.417 "nvme_iov_md": false 00:16:49.417 }, 00:16:49.417 "memory_domains": [ 00:16:49.417 { 00:16:49.417 "dma_device_id": "system", 00:16:49.417 "dma_device_type": 1 00:16:49.417 }, 00:16:49.417 { 00:16:49.417 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.417 "dma_device_type": 2 00:16:49.417 } 00:16:49.417 ], 00:16:49.417 "driver_specific": {} 00:16:49.417 }' 00:16:49.417 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.417 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.417 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:49.417 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.417 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.676 13:39:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:49.676 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:49.934 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:49.934 "name": "BaseBdev3", 00:16:49.934 "aliases": [ 00:16:49.934 "0ae9b374-6ca9-475b-8a8d-c165e57844a0" 00:16:49.934 ], 00:16:49.934 "product_name": "Malloc disk", 00:16:49.934 "block_size": 512, 00:16:49.934 "num_blocks": 65536, 00:16:49.934 "uuid": "0ae9b374-6ca9-475b-8a8d-c165e57844a0", 00:16:49.934 "assigned_rate_limits": { 00:16:49.934 "rw_ios_per_sec": 0, 00:16:49.934 "rw_mbytes_per_sec": 0, 00:16:49.934 "r_mbytes_per_sec": 0, 00:16:49.934 "w_mbytes_per_sec": 0 00:16:49.934 }, 00:16:49.934 "claimed": true, 00:16:49.934 "claim_type": "exclusive_write", 00:16:49.934 "zoned": false, 00:16:49.934 "supported_io_types": { 00:16:49.934 "read": true, 00:16:49.934 "write": true, 00:16:49.934 "unmap": true, 00:16:49.934 "flush": true, 00:16:49.934 "reset": true, 00:16:49.934 "nvme_admin": false, 00:16:49.934 "nvme_io": false, 00:16:49.934 "nvme_io_md": false, 00:16:49.934 "write_zeroes": true, 00:16:49.934 "zcopy": true, 00:16:49.934 "get_zone_info": false, 00:16:49.934 "zone_management": false, 00:16:49.934 "zone_append": false, 00:16:49.934 "compare": false, 00:16:49.934 "compare_and_write": false, 00:16:49.934 "abort": true, 00:16:49.934 "seek_hole": false, 00:16:49.934 "seek_data": false, 00:16:49.934 "copy": true, 00:16:49.934 "nvme_iov_md": false 00:16:49.934 }, 00:16:49.934 "memory_domains": [ 00:16:49.934 { 00:16:49.934 "dma_device_id": "system", 00:16:49.934 "dma_device_type": 1 00:16:49.934 }, 00:16:49.934 { 00:16:49.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.934 "dma_device_type": 2 00:16:49.934 } 00:16:49.934 ], 00:16:49.934 "driver_specific": {} 00:16:49.934 }' 00:16:49.934 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.934 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.934 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:49.934 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.935 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.935 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.935 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.193 13:39:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:50.193 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:50.452 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:50.452 "name": "BaseBdev4", 00:16:50.452 "aliases": [ 00:16:50.452 "3655f609-f91b-4d9e-92fe-3c743aa54701" 00:16:50.452 ], 00:16:50.452 "product_name": "Malloc disk", 00:16:50.452 "block_size": 512, 00:16:50.452 "num_blocks": 65536, 00:16:50.452 "uuid": "3655f609-f91b-4d9e-92fe-3c743aa54701", 00:16:50.452 "assigned_rate_limits": { 00:16:50.452 "rw_ios_per_sec": 0, 00:16:50.452 "rw_mbytes_per_sec": 0, 00:16:50.452 "r_mbytes_per_sec": 0, 00:16:50.452 "w_mbytes_per_sec": 0 00:16:50.452 }, 00:16:50.452 "claimed": true, 00:16:50.452 "claim_type": "exclusive_write", 00:16:50.452 "zoned": false, 00:16:50.452 "supported_io_types": { 00:16:50.452 "read": true, 00:16:50.452 "write": true, 00:16:50.452 "unmap": true, 00:16:50.452 "flush": true, 00:16:50.452 "reset": true, 00:16:50.452 "nvme_admin": false, 00:16:50.452 "nvme_io": false, 00:16:50.452 "nvme_io_md": false, 00:16:50.452 "write_zeroes": true, 00:16:50.452 "zcopy": true, 00:16:50.452 "get_zone_info": false, 00:16:50.452 "zone_management": false, 00:16:50.452 "zone_append": false, 00:16:50.452 "compare": false, 00:16:50.452 "compare_and_write": false, 00:16:50.452 "abort": true, 00:16:50.452 "seek_hole": false, 00:16:50.452 "seek_data": false, 00:16:50.452 "copy": true, 00:16:50.452 "nvme_iov_md": false 00:16:50.452 }, 00:16:50.452 "memory_domains": [ 00:16:50.452 { 00:16:50.452 "dma_device_id": "system", 00:16:50.452 "dma_device_type": 1 00:16:50.452 }, 00:16:50.452 { 00:16:50.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.452 "dma_device_type": 2 00:16:50.452 } 00:16:50.452 ], 00:16:50.452 "driver_specific": {} 00:16:50.452 }' 00:16:50.452 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.452 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.452 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:50.452 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.452 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:50.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:50.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.711 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.711 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:50.711 13:39:38 
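Condensed, the @315-@335 sequence above walks the array through a full degrade-and-repair cycle: a configured leg is detached and re-attached, the malloc behind BaseBdev1 is deleted outright, BaseBdev2 is re-attached, and the first slot is finally refilled with a fresh malloc bdev created under the UUID the array still reports for that slot, at which point the state flips to "online"; the bdev_raid_delete that follows tears the array down again. The same calls in order (every name, size and UUID below appears in the log above; only the $rpc shorthand is added):

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $rpc bdev_raid_remove_base_bdev BaseBdev3              # configuring, 2 of 4 discovered
  $rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3    # back to 3 of 4
  $rpc bdev_malloc_delete BaseBdev1                      # 2 of 4 again
  $rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2    # 3 of 4

  # Refill slot 0 under its recorded UUID so the new bdev is adopted directly.
  uuid=$($rpc bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
  $rpc bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"
  $rpc bdev_wait_for_examine

  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'  # "state": "online"
  $rpc bdev_raid_delete Existed_Raid
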
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:50.711 [2024-07-15 13:39:38.306147] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:50.711 [2024-07-15 13:39:38.306169] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:50.711 [2024-07-15 13:39:38.306206] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:50.711 [2024-07-15 13:39:38.306249] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:50.711 [2024-07-15 13:39:38.306257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1402900 name Existed_Raid, state offline 00:16:50.711 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 37426 00:16:50.711 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 37426 ']' 00:16:50.711 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 37426 00:16:50.711 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 37426 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 37426' 00:16:50.970 killing process with pid 37426 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 37426 00:16:50.970 [2024-07-15 13:39:38.374227] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:50.970 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 37426 00:16:50.970 [2024-07-15 13:39:38.409812] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:51.229 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:51.229 00:16:51.229 real 0m24.960s 00:16:51.229 user 0m45.375s 00:16:51.229 sys 0m4.918s 00:16:51.229 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:51.229 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.229 ************************************ 00:16:51.229 END TEST raid_state_function_test_sb 00:16:51.229 ************************************ 00:16:51.229 13:39:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:51.229 13:39:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:16:51.229 13:39:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:51.229 13:39:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:51.229 13:39:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:51.229 ************************************ 00:16:51.229 START TEST raid_superblock_test 00:16:51.229 
************************************ 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=41408 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 41408 /var/tmp/spdk-raid.sock 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 41408 ']' 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:51.229 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:51.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:51.230 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:51.230 13:39:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.230 [2024-07-15 13:39:38.754842] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
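raid_superblock_test runs against the same kind of private fixture: a bdev_svc app started with its RPC server on the dedicated raid socket and bdev_raid debug logging, four passthru bdevs (pt1-pt4) layered over malloc bdevs with fixed UUIDs, and a four-disk concat array created with an on-disk superblock (-s). A sketch of the fixture this run builds in the calls that follow — binary and rpc.py paths are shown relative to the spdk checkout, whereas the log uses the absolute workspace paths; everything else is as captured:

  sock=/var/tmp/spdk-raid.sock
  rpc="./scripts/rpc.py -s $sock"

  # Private bdev-only SPDK app with its own RPC socket and raid debug logs.
  ./test/app/bdev_svc/bdev_svc -r "$sock" -L bdev_raid &
  raid_pid=$!
  waitforlisten "$raid_pid" "$sock"        # autotest_common.sh helper: wait for the socket to answer

  # Four malloc-backed passthru bdevs with well-known UUIDs, then the array.
  for i in 1 2 3 4; do
      $rpc bdev_malloc_create 32 512 -b malloc$i
      $rpc bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  $rpc bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

  # The new array should come up online with all four legs discovered.
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
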
00:16:51.230 [2024-07-15 13:39:38.754900] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid41408 ] 00:16:51.230 [2024-07-15 13:39:38.840811] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:51.487 [2024-07-15 13:39:38.923455] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:51.487 [2024-07-15 13:39:38.975277] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:51.487 [2024-07-15 13:39:38.975307] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.053 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:52.310 malloc1 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:52.310 [2024-07-15 13:39:39.894651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:52.310 [2024-07-15 13:39:39.894692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.310 [2024-07-15 13:39:39.894708] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b97260 00:16:52.310 [2024-07-15 13:39:39.894717] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.310 [2024-07-15 13:39:39.895942] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.310 [2024-07-15 13:39:39.895966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:52.310 pt1 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.310 13:39:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:52.567 malloc2 00:16:52.567 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:52.823 [2024-07-15 13:39:40.267695] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:52.823 [2024-07-15 13:39:40.267740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.823 [2024-07-15 13:39:40.267753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d41310 00:16:52.823 [2024-07-15 13:39:40.267762] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.823 [2024-07-15 13:39:40.268833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.823 [2024-07-15 13:39:40.268857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:52.823 pt2 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.823 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:53.080 malloc3 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:53.080 [2024-07-15 13:39:40.624459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:53.080 [2024-07-15 13:39:40.624496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:53.080 [2024-07-15 13:39:40.624524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d44e70 00:16:53.080 [2024-07-15 13:39:40.624533] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:53.080 [2024-07-15 13:39:40.625499] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:53.080 [2024-07-15 13:39:40.625520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:53.080 pt3 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:53.080 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:53.337 malloc4 00:16:53.337 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:53.594 [2024-07-15 13:39:40.977232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:53.594 [2024-07-15 13:39:40.977267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:53.594 [2024-07-15 13:39:40.977299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d41d40 00:16:53.594 [2024-07-15 13:39:40.977307] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:53.594 [2024-07-15 13:39:40.978292] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:53.594 [2024-07-15 13:39:40.978312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:53.594 pt4 00:16:53.594 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:53.594 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:53.594 13:39:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:53.594 [2024-07-15 13:39:41.149703] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:53.594 [2024-07-15 13:39:41.150544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:53.594 [2024-07-15 13:39:41.150585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:53.594 [2024-07-15 13:39:41.150615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:53.594 [2024-07-15 13:39:41.150730] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d45180 00:16:53.594 [2024-07-15 13:39:41.150738] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:53.594 [2024-07-15 13:39:41.150876] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d49580 00:16:53.594 [2024-07-15 13:39:41.150973] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d45180 00:16:53.594 [2024-07-15 13:39:41.150980] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d45180 00:16:53.594 [2024-07-15 13:39:41.151063] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.594 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:53.850 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.850 "name": "raid_bdev1", 00:16:53.850 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:16:53.850 "strip_size_kb": 64, 00:16:53.850 "state": "online", 00:16:53.850 "raid_level": "concat", 00:16:53.850 "superblock": true, 00:16:53.850 "num_base_bdevs": 4, 00:16:53.850 "num_base_bdevs_discovered": 4, 00:16:53.850 "num_base_bdevs_operational": 4, 00:16:53.850 "base_bdevs_list": [ 00:16:53.850 { 00:16:53.850 "name": "pt1", 00:16:53.850 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:53.850 "is_configured": true, 00:16:53.850 "data_offset": 2048, 00:16:53.850 "data_size": 63488 00:16:53.850 }, 00:16:53.850 { 00:16:53.850 "name": "pt2", 00:16:53.850 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:53.850 "is_configured": true, 00:16:53.850 "data_offset": 2048, 00:16:53.850 "data_size": 63488 00:16:53.850 }, 00:16:53.850 { 00:16:53.850 "name": "pt3", 00:16:53.850 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:53.850 "is_configured": true, 00:16:53.850 "data_offset": 2048, 00:16:53.850 "data_size": 63488 00:16:53.850 }, 00:16:53.850 { 00:16:53.850 "name": "pt4", 00:16:53.850 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:53.850 "is_configured": true, 00:16:53.850 "data_offset": 2048, 00:16:53.850 "data_size": 63488 00:16:53.850 } 00:16:53.850 ] 00:16:53.850 }' 00:16:53.850 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.850 13:39:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:54.413 [2024-07-15 13:39:41.968026] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:54.413 "name": "raid_bdev1", 00:16:54.413 "aliases": [ 00:16:54.413 "6eb8433a-9eac-4763-88e3-5988ada403f7" 00:16:54.413 ], 00:16:54.413 "product_name": "Raid Volume", 00:16:54.413 "block_size": 512, 00:16:54.413 "num_blocks": 253952, 00:16:54.413 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:16:54.413 "assigned_rate_limits": { 00:16:54.413 "rw_ios_per_sec": 0, 00:16:54.413 "rw_mbytes_per_sec": 0, 00:16:54.413 "r_mbytes_per_sec": 0, 00:16:54.413 "w_mbytes_per_sec": 0 00:16:54.413 }, 00:16:54.413 "claimed": false, 00:16:54.413 "zoned": false, 00:16:54.413 "supported_io_types": { 00:16:54.413 "read": true, 00:16:54.413 "write": true, 00:16:54.413 "unmap": true, 00:16:54.413 "flush": true, 00:16:54.413 "reset": true, 00:16:54.413 "nvme_admin": false, 00:16:54.413 "nvme_io": false, 00:16:54.413 "nvme_io_md": false, 00:16:54.413 "write_zeroes": true, 00:16:54.413 "zcopy": false, 00:16:54.413 "get_zone_info": false, 00:16:54.413 "zone_management": false, 00:16:54.413 "zone_append": false, 00:16:54.413 "compare": false, 00:16:54.413 "compare_and_write": false, 00:16:54.413 "abort": false, 00:16:54.413 "seek_hole": false, 00:16:54.413 "seek_data": false, 00:16:54.413 "copy": false, 00:16:54.413 "nvme_iov_md": false 00:16:54.413 }, 00:16:54.413 "memory_domains": [ 00:16:54.413 { 00:16:54.413 "dma_device_id": "system", 00:16:54.413 "dma_device_type": 1 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.413 "dma_device_type": 2 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "system", 00:16:54.413 "dma_device_type": 1 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.413 "dma_device_type": 2 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "system", 00:16:54.413 "dma_device_type": 1 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.413 "dma_device_type": 2 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "system", 00:16:54.413 "dma_device_type": 1 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.413 "dma_device_type": 2 00:16:54.413 } 00:16:54.413 ], 00:16:54.413 "driver_specific": { 00:16:54.413 "raid": { 00:16:54.413 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:16:54.413 "strip_size_kb": 64, 00:16:54.413 "state": "online", 00:16:54.413 "raid_level": "concat", 00:16:54.413 "superblock": 
true, 00:16:54.413 "num_base_bdevs": 4, 00:16:54.413 "num_base_bdevs_discovered": 4, 00:16:54.413 "num_base_bdevs_operational": 4, 00:16:54.413 "base_bdevs_list": [ 00:16:54.413 { 00:16:54.413 "name": "pt1", 00:16:54.413 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.413 "is_configured": true, 00:16:54.413 "data_offset": 2048, 00:16:54.413 "data_size": 63488 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "name": "pt2", 00:16:54.413 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:54.413 "is_configured": true, 00:16:54.413 "data_offset": 2048, 00:16:54.413 "data_size": 63488 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "name": "pt3", 00:16:54.413 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:54.413 "is_configured": true, 00:16:54.413 "data_offset": 2048, 00:16:54.413 "data_size": 63488 00:16:54.413 }, 00:16:54.413 { 00:16:54.413 "name": "pt4", 00:16:54.413 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:54.413 "is_configured": true, 00:16:54.413 "data_offset": 2048, 00:16:54.413 "data_size": 63488 00:16:54.413 } 00:16:54.413 ] 00:16:54.413 } 00:16:54.413 } 00:16:54.413 }' 00:16:54.413 13:39:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:54.413 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:54.413 pt2 00:16:54.413 pt3 00:16:54.413 pt4' 00:16:54.413 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.413 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:54.413 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.670 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.670 "name": "pt1", 00:16:54.670 "aliases": [ 00:16:54.670 "00000000-0000-0000-0000-000000000001" 00:16:54.670 ], 00:16:54.670 "product_name": "passthru", 00:16:54.670 "block_size": 512, 00:16:54.670 "num_blocks": 65536, 00:16:54.670 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.670 "assigned_rate_limits": { 00:16:54.670 "rw_ios_per_sec": 0, 00:16:54.670 "rw_mbytes_per_sec": 0, 00:16:54.670 "r_mbytes_per_sec": 0, 00:16:54.670 "w_mbytes_per_sec": 0 00:16:54.670 }, 00:16:54.670 "claimed": true, 00:16:54.670 "claim_type": "exclusive_write", 00:16:54.670 "zoned": false, 00:16:54.670 "supported_io_types": { 00:16:54.670 "read": true, 00:16:54.670 "write": true, 00:16:54.670 "unmap": true, 00:16:54.670 "flush": true, 00:16:54.670 "reset": true, 00:16:54.671 "nvme_admin": false, 00:16:54.671 "nvme_io": false, 00:16:54.671 "nvme_io_md": false, 00:16:54.671 "write_zeroes": true, 00:16:54.671 "zcopy": true, 00:16:54.671 "get_zone_info": false, 00:16:54.671 "zone_management": false, 00:16:54.671 "zone_append": false, 00:16:54.671 "compare": false, 00:16:54.671 "compare_and_write": false, 00:16:54.671 "abort": true, 00:16:54.671 "seek_hole": false, 00:16:54.671 "seek_data": false, 00:16:54.671 "copy": true, 00:16:54.671 "nvme_iov_md": false 00:16:54.671 }, 00:16:54.671 "memory_domains": [ 00:16:54.671 { 00:16:54.671 "dma_device_id": "system", 00:16:54.671 "dma_device_type": 1 00:16:54.671 }, 00:16:54.671 { 00:16:54.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.671 "dma_device_type": 2 00:16:54.671 } 00:16:54.671 ], 00:16:54.671 "driver_specific": { 00:16:54.671 "passthru": 
{ 00:16:54.671 "name": "pt1", 00:16:54.671 "base_bdev_name": "malloc1" 00:16:54.671 } 00:16:54.671 } 00:16:54.671 }' 00:16:54.671 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.671 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.671 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.671 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.928 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:55.184 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.184 "name": "pt2", 00:16:55.184 "aliases": [ 00:16:55.184 "00000000-0000-0000-0000-000000000002" 00:16:55.184 ], 00:16:55.184 "product_name": "passthru", 00:16:55.184 "block_size": 512, 00:16:55.184 "num_blocks": 65536, 00:16:55.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:55.184 "assigned_rate_limits": { 00:16:55.184 "rw_ios_per_sec": 0, 00:16:55.184 "rw_mbytes_per_sec": 0, 00:16:55.184 "r_mbytes_per_sec": 0, 00:16:55.184 "w_mbytes_per_sec": 0 00:16:55.184 }, 00:16:55.184 "claimed": true, 00:16:55.184 "claim_type": "exclusive_write", 00:16:55.185 "zoned": false, 00:16:55.185 "supported_io_types": { 00:16:55.185 "read": true, 00:16:55.185 "write": true, 00:16:55.185 "unmap": true, 00:16:55.185 "flush": true, 00:16:55.185 "reset": true, 00:16:55.185 "nvme_admin": false, 00:16:55.185 "nvme_io": false, 00:16:55.185 "nvme_io_md": false, 00:16:55.185 "write_zeroes": true, 00:16:55.185 "zcopy": true, 00:16:55.185 "get_zone_info": false, 00:16:55.185 "zone_management": false, 00:16:55.185 "zone_append": false, 00:16:55.185 "compare": false, 00:16:55.185 "compare_and_write": false, 00:16:55.185 "abort": true, 00:16:55.185 "seek_hole": false, 00:16:55.185 "seek_data": false, 00:16:55.185 "copy": true, 00:16:55.185 "nvme_iov_md": false 00:16:55.185 }, 00:16:55.185 "memory_domains": [ 00:16:55.185 { 00:16:55.185 "dma_device_id": "system", 00:16:55.185 "dma_device_type": 1 00:16:55.185 }, 00:16:55.185 { 00:16:55.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.185 "dma_device_type": 2 00:16:55.185 } 00:16:55.185 ], 00:16:55.185 "driver_specific": { 00:16:55.185 "passthru": { 00:16:55.185 "name": "pt2", 00:16:55.185 "base_bdev_name": "malloc2" 00:16:55.185 } 00:16:55.185 } 00:16:55.185 }' 00:16:55.185 
13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.185 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.185 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.185 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.185 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:55.441 13:39:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.706 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.706 "name": "pt3", 00:16:55.706 "aliases": [ 00:16:55.706 "00000000-0000-0000-0000-000000000003" 00:16:55.706 ], 00:16:55.707 "product_name": "passthru", 00:16:55.707 "block_size": 512, 00:16:55.707 "num_blocks": 65536, 00:16:55.707 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:55.707 "assigned_rate_limits": { 00:16:55.707 "rw_ios_per_sec": 0, 00:16:55.707 "rw_mbytes_per_sec": 0, 00:16:55.707 "r_mbytes_per_sec": 0, 00:16:55.707 "w_mbytes_per_sec": 0 00:16:55.707 }, 00:16:55.707 "claimed": true, 00:16:55.707 "claim_type": "exclusive_write", 00:16:55.707 "zoned": false, 00:16:55.707 "supported_io_types": { 00:16:55.707 "read": true, 00:16:55.707 "write": true, 00:16:55.707 "unmap": true, 00:16:55.707 "flush": true, 00:16:55.707 "reset": true, 00:16:55.707 "nvme_admin": false, 00:16:55.707 "nvme_io": false, 00:16:55.707 "nvme_io_md": false, 00:16:55.707 "write_zeroes": true, 00:16:55.707 "zcopy": true, 00:16:55.707 "get_zone_info": false, 00:16:55.707 "zone_management": false, 00:16:55.707 "zone_append": false, 00:16:55.707 "compare": false, 00:16:55.707 "compare_and_write": false, 00:16:55.707 "abort": true, 00:16:55.707 "seek_hole": false, 00:16:55.707 "seek_data": false, 00:16:55.707 "copy": true, 00:16:55.707 "nvme_iov_md": false 00:16:55.707 }, 00:16:55.707 "memory_domains": [ 00:16:55.707 { 00:16:55.707 "dma_device_id": "system", 00:16:55.707 "dma_device_type": 1 00:16:55.707 }, 00:16:55.707 { 00:16:55.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.707 "dma_device_type": 2 00:16:55.707 } 00:16:55.707 ], 00:16:55.707 "driver_specific": { 00:16:55.707 "passthru": { 00:16:55.707 "name": "pt3", 00:16:55.707 "base_bdev_name": "malloc3" 00:16:55.707 } 00:16:55.707 } 00:16:55.707 }' 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.707 13:39:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.707 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.963 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:56.220 "name": "pt4", 00:16:56.220 "aliases": [ 00:16:56.220 "00000000-0000-0000-0000-000000000004" 00:16:56.220 ], 00:16:56.220 "product_name": "passthru", 00:16:56.220 "block_size": 512, 00:16:56.220 "num_blocks": 65536, 00:16:56.220 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:56.220 "assigned_rate_limits": { 00:16:56.220 "rw_ios_per_sec": 0, 00:16:56.220 "rw_mbytes_per_sec": 0, 00:16:56.220 "r_mbytes_per_sec": 0, 00:16:56.220 "w_mbytes_per_sec": 0 00:16:56.220 }, 00:16:56.220 "claimed": true, 00:16:56.220 "claim_type": "exclusive_write", 00:16:56.220 "zoned": false, 00:16:56.220 "supported_io_types": { 00:16:56.220 "read": true, 00:16:56.220 "write": true, 00:16:56.220 "unmap": true, 00:16:56.220 "flush": true, 00:16:56.220 "reset": true, 00:16:56.220 "nvme_admin": false, 00:16:56.220 "nvme_io": false, 00:16:56.220 "nvme_io_md": false, 00:16:56.220 "write_zeroes": true, 00:16:56.220 "zcopy": true, 00:16:56.220 "get_zone_info": false, 00:16:56.220 "zone_management": false, 00:16:56.220 "zone_append": false, 00:16:56.220 "compare": false, 00:16:56.220 "compare_and_write": false, 00:16:56.220 "abort": true, 00:16:56.220 "seek_hole": false, 00:16:56.220 "seek_data": false, 00:16:56.220 "copy": true, 00:16:56.220 "nvme_iov_md": false 00:16:56.220 }, 00:16:56.220 "memory_domains": [ 00:16:56.220 { 00:16:56.220 "dma_device_id": "system", 00:16:56.220 "dma_device_type": 1 00:16:56.220 }, 00:16:56.220 { 00:16:56.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.220 "dma_device_type": 2 00:16:56.220 } 00:16:56.220 ], 00:16:56.220 "driver_specific": { 00:16:56.220 "passthru": { 00:16:56.220 "name": "pt4", 00:16:56.220 "base_bdev_name": "malloc4" 00:16:56.220 } 00:16:56.220 } 00:16:56.220 }' 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.220 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.477 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:56.477 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.477 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.477 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:56.477 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:56.477 13:39:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:56.735 [2024-07-15 13:39:44.101548] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:56.735 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=6eb8433a-9eac-4763-88e3-5988ada403f7 00:16:56.735 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 6eb8433a-9eac-4763-88e3-5988ada403f7 ']' 00:16:56.735 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:56.735 [2024-07-15 13:39:44.285799] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:56.735 [2024-07-15 13:39:44.285818] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:56.735 [2024-07-15 13:39:44.285860] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:56.735 [2024-07-15 13:39:44.285908] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:56.735 [2024-07-15 13:39:44.285916] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d45180 name raid_bdev1, state offline 00:16:56.735 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.735 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:56.993 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:56.993 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:56.993 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.993 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:57.250 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.250 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:57.250 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.250 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:57.507 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.507 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:57.764 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:58.022 [2024-07-15 13:39:45.500908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:58.022 [2024-07-15 13:39:45.501957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:58.022 [2024-07-15 13:39:45.501991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:16:58.022 [2024-07-15 13:39:45.502022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:58.022 [2024-07-15 13:39:45.502059] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:58.022 [2024-07-15 13:39:45.502089] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:58.022 [2024-07-15 13:39:45.502121] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:58.022 [2024-07-15 13:39:45.502136] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:58.022 [2024-07-15 13:39:45.502149] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:58.022 [2024-07-15 13:39:45.502156] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d41540 name raid_bdev1, state configuring 00:16:58.022 request: 00:16:58.022 { 00:16:58.022 "name": "raid_bdev1", 00:16:58.022 "raid_level": "concat", 00:16:58.022 "base_bdevs": [ 00:16:58.022 "malloc1", 00:16:58.022 "malloc2", 00:16:58.022 "malloc3", 00:16:58.022 "malloc4" 00:16:58.022 ], 00:16:58.022 "strip_size_kb": 64, 00:16:58.022 "superblock": false, 00:16:58.022 "method": "bdev_raid_create", 00:16:58.022 "req_id": 1 00:16:58.022 } 00:16:58.022 Got JSON-RPC error response 00:16:58.022 response: 00:16:58.022 { 00:16:58.022 "code": -17, 00:16:58.022 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:58.022 } 00:16:58.022 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:58.022 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:58.022 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:58.022 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:58.022 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:58.022 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:58.280 [2024-07-15 13:39:45.853779] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:58.280 [2024-07-15 13:39:45.853819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.280 [2024-07-15 13:39:45.853836] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d43dc0 00:16:58.280 [2024-07-15 13:39:45.853844] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.280 [2024-07-15 13:39:45.855126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.280 [2024-07-15 13:39:45.855153] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:58.280 [2024-07-15 
13:39:45.855208] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:58.280 [2024-07-15 13:39:45.855233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.280 pt1 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.280 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.537 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.537 "name": "raid_bdev1", 00:16:58.537 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:16:58.537 "strip_size_kb": 64, 00:16:58.537 "state": "configuring", 00:16:58.537 "raid_level": "concat", 00:16:58.537 "superblock": true, 00:16:58.537 "num_base_bdevs": 4, 00:16:58.537 "num_base_bdevs_discovered": 1, 00:16:58.537 "num_base_bdevs_operational": 4, 00:16:58.537 "base_bdevs_list": [ 00:16:58.537 { 00:16:58.537 "name": "pt1", 00:16:58.537 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.537 "is_configured": true, 00:16:58.537 "data_offset": 2048, 00:16:58.537 "data_size": 63488 00:16:58.537 }, 00:16:58.537 { 00:16:58.537 "name": null, 00:16:58.537 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.537 "is_configured": false, 00:16:58.537 "data_offset": 2048, 00:16:58.537 "data_size": 63488 00:16:58.537 }, 00:16:58.537 { 00:16:58.537 "name": null, 00:16:58.537 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.537 "is_configured": false, 00:16:58.537 "data_offset": 2048, 00:16:58.537 "data_size": 63488 00:16:58.537 }, 00:16:58.537 { 00:16:58.537 "name": null, 00:16:58.537 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:58.537 "is_configured": false, 00:16:58.537 "data_offset": 2048, 00:16:58.537 "data_size": 63488 00:16:58.537 } 00:16:58.537 ] 00:16:58.537 }' 00:16:58.537 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.537 13:39:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.100 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:16:59.100 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:59.100 [2024-07-15 13:39:46.704120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:59.100 [2024-07-15 13:39:46.704158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:59.100 [2024-07-15 13:39:46.704174] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d440c0 00:16:59.100 [2024-07-15 13:39:46.704183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:59.100 [2024-07-15 13:39:46.704448] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:59.100 [2024-07-15 13:39:46.704460] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:59.100 [2024-07-15 13:39:46.704507] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:59.100 [2024-07-15 13:39:46.704521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:59.100 pt2 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:59.358 [2024-07-15 13:39:46.888606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.358 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:59.615 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.615 "name": "raid_bdev1", 00:16:59.615 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:16:59.615 "strip_size_kb": 64, 00:16:59.615 "state": "configuring", 00:16:59.615 "raid_level": "concat", 00:16:59.615 "superblock": true, 00:16:59.615 "num_base_bdevs": 4, 00:16:59.615 "num_base_bdevs_discovered": 1, 00:16:59.615 "num_base_bdevs_operational": 4, 00:16:59.615 "base_bdevs_list": [ 00:16:59.615 { 00:16:59.615 "name": "pt1", 00:16:59.615 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.615 "is_configured": true, 00:16:59.615 "data_offset": 2048, 00:16:59.615 "data_size": 63488 00:16:59.615 }, 00:16:59.615 
{ 00:16:59.615 "name": null, 00:16:59.615 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.615 "is_configured": false, 00:16:59.615 "data_offset": 2048, 00:16:59.615 "data_size": 63488 00:16:59.615 }, 00:16:59.615 { 00:16:59.615 "name": null, 00:16:59.615 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.615 "is_configured": false, 00:16:59.615 "data_offset": 2048, 00:16:59.615 "data_size": 63488 00:16:59.615 }, 00:16:59.615 { 00:16:59.615 "name": null, 00:16:59.615 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:59.615 "is_configured": false, 00:16:59.615 "data_offset": 2048, 00:16:59.615 "data_size": 63488 00:16:59.615 } 00:16:59.615 ] 00:16:59.615 }' 00:16:59.615 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.615 13:39:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.179 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:00.179 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.179 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:00.179 [2024-07-15 13:39:47.750805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:00.179 [2024-07-15 13:39:47.750839] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.179 [2024-07-15 13:39:47.750869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d442f0 00:17:00.179 [2024-07-15 13:39:47.750877] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.179 [2024-07-15 13:39:47.751140] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.179 [2024-07-15 13:39:47.751152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:00.179 [2024-07-15 13:39:47.751196] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:00.179 [2024-07-15 13:39:47.751210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:00.179 pt2 00:17:00.179 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.179 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.179 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:00.438 [2024-07-15 13:39:47.923256] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:00.438 [2024-07-15 13:39:47.923286] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.438 [2024-07-15 13:39:47.923297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b961e0 00:17:00.438 [2024-07-15 13:39:47.923305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.438 [2024-07-15 13:39:47.923525] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.438 [2024-07-15 13:39:47.923537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:00.438 [2024-07-15 13:39:47.923573] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:00.438 [2024-07-15 13:39:47.923585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:00.438 pt3 00:17:00.438 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.438 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.438 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:00.696 [2024-07-15 13:39:48.103726] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:00.696 [2024-07-15 13:39:48.103750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.696 [2024-07-15 13:39:48.103761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d47310 00:17:00.696 [2024-07-15 13:39:48.103769] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.696 [2024-07-15 13:39:48.103977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.696 [2024-07-15 13:39:48.103988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:00.696 [2024-07-15 13:39:48.104029] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:00.696 [2024-07-15 13:39:48.104042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:00.696 [2024-07-15 13:39:48.104125] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d43800 00:17:00.696 [2024-07-15 13:39:48.104132] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:00.696 [2024-07-15 13:39:48.104250] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d43cd0 00:17:00.696 [2024-07-15 13:39:48.104342] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d43800 00:17:00.696 [2024-07-15 13:39:48.104349] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d43800 00:17:00.696 [2024-07-15 13:39:48.104413] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:00.696 pt4 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.696 13:39:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.696 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:00.954 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.954 "name": "raid_bdev1", 00:17:00.954 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:17:00.954 "strip_size_kb": 64, 00:17:00.954 "state": "online", 00:17:00.954 "raid_level": "concat", 00:17:00.954 "superblock": true, 00:17:00.954 "num_base_bdevs": 4, 00:17:00.954 "num_base_bdevs_discovered": 4, 00:17:00.954 "num_base_bdevs_operational": 4, 00:17:00.954 "base_bdevs_list": [ 00:17:00.954 { 00:17:00.954 "name": "pt1", 00:17:00.954 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:00.954 "is_configured": true, 00:17:00.954 "data_offset": 2048, 00:17:00.954 "data_size": 63488 00:17:00.954 }, 00:17:00.954 { 00:17:00.954 "name": "pt2", 00:17:00.954 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.954 "is_configured": true, 00:17:00.954 "data_offset": 2048, 00:17:00.954 "data_size": 63488 00:17:00.954 }, 00:17:00.954 { 00:17:00.954 "name": "pt3", 00:17:00.954 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.954 "is_configured": true, 00:17:00.954 "data_offset": 2048, 00:17:00.954 "data_size": 63488 00:17:00.954 }, 00:17:00.954 { 00:17:00.954 "name": "pt4", 00:17:00.954 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:00.954 "is_configured": true, 00:17:00.954 "data_offset": 2048, 00:17:00.954 "data_size": 63488 00:17:00.954 } 00:17:00.954 ] 00:17:00.954 }' 00:17:00.954 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.954 13:39:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.211 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:01.211 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:01.211 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:01.211 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:01.212 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:01.212 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:01.212 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:01.212 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.470 [2024-07-15 13:39:48.974176] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.470 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:01.470 "name": "raid_bdev1", 00:17:01.470 "aliases": [ 00:17:01.470 "6eb8433a-9eac-4763-88e3-5988ada403f7" 00:17:01.470 ], 00:17:01.470 "product_name": "Raid Volume", 00:17:01.470 "block_size": 512, 00:17:01.470 "num_blocks": 253952, 00:17:01.470 "uuid": 
"6eb8433a-9eac-4763-88e3-5988ada403f7", 00:17:01.470 "assigned_rate_limits": { 00:17:01.470 "rw_ios_per_sec": 0, 00:17:01.470 "rw_mbytes_per_sec": 0, 00:17:01.470 "r_mbytes_per_sec": 0, 00:17:01.470 "w_mbytes_per_sec": 0 00:17:01.470 }, 00:17:01.470 "claimed": false, 00:17:01.470 "zoned": false, 00:17:01.470 "supported_io_types": { 00:17:01.470 "read": true, 00:17:01.470 "write": true, 00:17:01.470 "unmap": true, 00:17:01.470 "flush": true, 00:17:01.470 "reset": true, 00:17:01.470 "nvme_admin": false, 00:17:01.470 "nvme_io": false, 00:17:01.470 "nvme_io_md": false, 00:17:01.470 "write_zeroes": true, 00:17:01.470 "zcopy": false, 00:17:01.470 "get_zone_info": false, 00:17:01.470 "zone_management": false, 00:17:01.470 "zone_append": false, 00:17:01.470 "compare": false, 00:17:01.470 "compare_and_write": false, 00:17:01.470 "abort": false, 00:17:01.470 "seek_hole": false, 00:17:01.470 "seek_data": false, 00:17:01.470 "copy": false, 00:17:01.470 "nvme_iov_md": false 00:17:01.470 }, 00:17:01.470 "memory_domains": [ 00:17:01.470 { 00:17:01.470 "dma_device_id": "system", 00:17:01.470 "dma_device_type": 1 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.470 "dma_device_type": 2 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "system", 00:17:01.470 "dma_device_type": 1 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.470 "dma_device_type": 2 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "system", 00:17:01.470 "dma_device_type": 1 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.470 "dma_device_type": 2 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "system", 00:17:01.470 "dma_device_type": 1 00:17:01.470 }, 00:17:01.470 { 00:17:01.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.471 "dma_device_type": 2 00:17:01.471 } 00:17:01.471 ], 00:17:01.471 "driver_specific": { 00:17:01.471 "raid": { 00:17:01.471 "uuid": "6eb8433a-9eac-4763-88e3-5988ada403f7", 00:17:01.471 "strip_size_kb": 64, 00:17:01.471 "state": "online", 00:17:01.471 "raid_level": "concat", 00:17:01.471 "superblock": true, 00:17:01.471 "num_base_bdevs": 4, 00:17:01.471 "num_base_bdevs_discovered": 4, 00:17:01.471 "num_base_bdevs_operational": 4, 00:17:01.471 "base_bdevs_list": [ 00:17:01.471 { 00:17:01.471 "name": "pt1", 00:17:01.471 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.471 "is_configured": true, 00:17:01.471 "data_offset": 2048, 00:17:01.471 "data_size": 63488 00:17:01.471 }, 00:17:01.471 { 00:17:01.471 "name": "pt2", 00:17:01.471 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.471 "is_configured": true, 00:17:01.471 "data_offset": 2048, 00:17:01.471 "data_size": 63488 00:17:01.471 }, 00:17:01.471 { 00:17:01.471 "name": "pt3", 00:17:01.471 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.471 "is_configured": true, 00:17:01.471 "data_offset": 2048, 00:17:01.471 "data_size": 63488 00:17:01.471 }, 00:17:01.471 { 00:17:01.471 "name": "pt4", 00:17:01.471 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:01.471 "is_configured": true, 00:17:01.471 "data_offset": 2048, 00:17:01.471 "data_size": 63488 00:17:01.471 } 00:17:01.471 ] 00:17:01.471 } 00:17:01.471 } 00:17:01.471 }' 00:17:01.471 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:01.471 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:17:01.471 pt2 00:17:01.471 pt3 00:17:01.471 pt4' 00:17:01.471 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.471 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:01.471 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.729 "name": "pt1", 00:17:01.729 "aliases": [ 00:17:01.729 "00000000-0000-0000-0000-000000000001" 00:17:01.729 ], 00:17:01.729 "product_name": "passthru", 00:17:01.729 "block_size": 512, 00:17:01.729 "num_blocks": 65536, 00:17:01.729 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.729 "assigned_rate_limits": { 00:17:01.729 "rw_ios_per_sec": 0, 00:17:01.729 "rw_mbytes_per_sec": 0, 00:17:01.729 "r_mbytes_per_sec": 0, 00:17:01.729 "w_mbytes_per_sec": 0 00:17:01.729 }, 00:17:01.729 "claimed": true, 00:17:01.729 "claim_type": "exclusive_write", 00:17:01.729 "zoned": false, 00:17:01.729 "supported_io_types": { 00:17:01.729 "read": true, 00:17:01.729 "write": true, 00:17:01.729 "unmap": true, 00:17:01.729 "flush": true, 00:17:01.729 "reset": true, 00:17:01.729 "nvme_admin": false, 00:17:01.729 "nvme_io": false, 00:17:01.729 "nvme_io_md": false, 00:17:01.729 "write_zeroes": true, 00:17:01.729 "zcopy": true, 00:17:01.729 "get_zone_info": false, 00:17:01.729 "zone_management": false, 00:17:01.729 "zone_append": false, 00:17:01.729 "compare": false, 00:17:01.729 "compare_and_write": false, 00:17:01.729 "abort": true, 00:17:01.729 "seek_hole": false, 00:17:01.729 "seek_data": false, 00:17:01.729 "copy": true, 00:17:01.729 "nvme_iov_md": false 00:17:01.729 }, 00:17:01.729 "memory_domains": [ 00:17:01.729 { 00:17:01.729 "dma_device_id": "system", 00:17:01.729 "dma_device_type": 1 00:17:01.729 }, 00:17:01.729 { 00:17:01.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.729 "dma_device_type": 2 00:17:01.729 } 00:17:01.729 ], 00:17:01.729 "driver_specific": { 00:17:01.729 "passthru": { 00:17:01.729 "name": "pt1", 00:17:01.729 "base_bdev_name": "malloc1" 00:17:01.729 } 00:17:01.729 } 00:17:01.729 }' 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.729 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:01.987 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.245 "name": "pt2", 00:17:02.245 "aliases": [ 00:17:02.245 "00000000-0000-0000-0000-000000000002" 00:17:02.245 ], 00:17:02.245 "product_name": "passthru", 00:17:02.245 "block_size": 512, 00:17:02.245 "num_blocks": 65536, 00:17:02.245 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.245 "assigned_rate_limits": { 00:17:02.245 "rw_ios_per_sec": 0, 00:17:02.245 "rw_mbytes_per_sec": 0, 00:17:02.245 "r_mbytes_per_sec": 0, 00:17:02.245 "w_mbytes_per_sec": 0 00:17:02.245 }, 00:17:02.245 "claimed": true, 00:17:02.245 "claim_type": "exclusive_write", 00:17:02.245 "zoned": false, 00:17:02.245 "supported_io_types": { 00:17:02.245 "read": true, 00:17:02.245 "write": true, 00:17:02.245 "unmap": true, 00:17:02.245 "flush": true, 00:17:02.245 "reset": true, 00:17:02.245 "nvme_admin": false, 00:17:02.245 "nvme_io": false, 00:17:02.245 "nvme_io_md": false, 00:17:02.245 "write_zeroes": true, 00:17:02.245 "zcopy": true, 00:17:02.245 "get_zone_info": false, 00:17:02.245 "zone_management": false, 00:17:02.245 "zone_append": false, 00:17:02.245 "compare": false, 00:17:02.245 "compare_and_write": false, 00:17:02.245 "abort": true, 00:17:02.245 "seek_hole": false, 00:17:02.245 "seek_data": false, 00:17:02.245 "copy": true, 00:17:02.245 "nvme_iov_md": false 00:17:02.245 }, 00:17:02.245 "memory_domains": [ 00:17:02.245 { 00:17:02.245 "dma_device_id": "system", 00:17:02.245 "dma_device_type": 1 00:17:02.245 }, 00:17:02.245 { 00:17:02.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.245 "dma_device_type": 2 00:17:02.245 } 00:17:02.245 ], 00:17:02.245 "driver_specific": { 00:17:02.245 "passthru": { 00:17:02.245 "name": "pt2", 00:17:02.245 "base_bdev_name": "malloc2" 00:17:02.245 } 00:17:02.245 } 00:17:02.245 }' 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.245 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.501 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.501 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.501 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.501 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:02.501 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.501 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.501 "name": "pt3", 00:17:02.501 "aliases": [ 00:17:02.501 "00000000-0000-0000-0000-000000000003" 00:17:02.501 ], 00:17:02.501 "product_name": "passthru", 00:17:02.501 "block_size": 512, 00:17:02.501 "num_blocks": 65536, 00:17:02.501 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:02.501 "assigned_rate_limits": { 00:17:02.501 "rw_ios_per_sec": 0, 00:17:02.501 "rw_mbytes_per_sec": 0, 00:17:02.501 "r_mbytes_per_sec": 0, 00:17:02.501 "w_mbytes_per_sec": 0 00:17:02.501 }, 00:17:02.501 "claimed": true, 00:17:02.501 "claim_type": "exclusive_write", 00:17:02.501 "zoned": false, 00:17:02.501 "supported_io_types": { 00:17:02.501 "read": true, 00:17:02.501 "write": true, 00:17:02.501 "unmap": true, 00:17:02.501 "flush": true, 00:17:02.501 "reset": true, 00:17:02.501 "nvme_admin": false, 00:17:02.501 "nvme_io": false, 00:17:02.501 "nvme_io_md": false, 00:17:02.501 "write_zeroes": true, 00:17:02.501 "zcopy": true, 00:17:02.501 "get_zone_info": false, 00:17:02.501 "zone_management": false, 00:17:02.501 "zone_append": false, 00:17:02.501 "compare": false, 00:17:02.501 "compare_and_write": false, 00:17:02.501 "abort": true, 00:17:02.501 "seek_hole": false, 00:17:02.501 "seek_data": false, 00:17:02.501 "copy": true, 00:17:02.501 "nvme_iov_md": false 00:17:02.501 }, 00:17:02.501 "memory_domains": [ 00:17:02.501 { 00:17:02.501 "dma_device_id": "system", 00:17:02.501 "dma_device_type": 1 00:17:02.501 }, 00:17:02.501 { 00:17:02.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.501 "dma_device_type": 2 00:17:02.501 } 00:17:02.501 ], 00:17:02.501 "driver_specific": { 00:17:02.501 "passthru": { 00:17:02.501 "name": "pt3", 00:17:02.501 "base_bdev_name": "malloc3" 00:17:02.501 } 00:17:02.501 } 00:17:02.501 }' 00:17:02.501 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.501 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.758 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.037 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.037 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.037 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:03.037 
13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.037 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.037 "name": "pt4", 00:17:03.037 "aliases": [ 00:17:03.037 "00000000-0000-0000-0000-000000000004" 00:17:03.037 ], 00:17:03.037 "product_name": "passthru", 00:17:03.037 "block_size": 512, 00:17:03.037 "num_blocks": 65536, 00:17:03.037 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:03.037 "assigned_rate_limits": { 00:17:03.037 "rw_ios_per_sec": 0, 00:17:03.037 "rw_mbytes_per_sec": 0, 00:17:03.037 "r_mbytes_per_sec": 0, 00:17:03.037 "w_mbytes_per_sec": 0 00:17:03.037 }, 00:17:03.037 "claimed": true, 00:17:03.037 "claim_type": "exclusive_write", 00:17:03.037 "zoned": false, 00:17:03.037 "supported_io_types": { 00:17:03.037 "read": true, 00:17:03.037 "write": true, 00:17:03.037 "unmap": true, 00:17:03.037 "flush": true, 00:17:03.037 "reset": true, 00:17:03.037 "nvme_admin": false, 00:17:03.037 "nvme_io": false, 00:17:03.037 "nvme_io_md": false, 00:17:03.037 "write_zeroes": true, 00:17:03.037 "zcopy": true, 00:17:03.037 "get_zone_info": false, 00:17:03.037 "zone_management": false, 00:17:03.037 "zone_append": false, 00:17:03.037 "compare": false, 00:17:03.037 "compare_and_write": false, 00:17:03.037 "abort": true, 00:17:03.037 "seek_hole": false, 00:17:03.037 "seek_data": false, 00:17:03.037 "copy": true, 00:17:03.037 "nvme_iov_md": false 00:17:03.037 }, 00:17:03.037 "memory_domains": [ 00:17:03.037 { 00:17:03.037 "dma_device_id": "system", 00:17:03.037 "dma_device_type": 1 00:17:03.037 }, 00:17:03.037 { 00:17:03.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.037 "dma_device_type": 2 00:17:03.037 } 00:17:03.037 ], 00:17:03.037 "driver_specific": { 00:17:03.037 "passthru": { 00:17:03.037 "name": "pt4", 00:17:03.037 "base_bdev_name": "malloc4" 00:17:03.037 } 00:17:03.037 } 00:17:03.037 }' 00:17:03.037 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:03.335 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:03.593 [2024-07-15 13:39:51.043563] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:03.593 13:39:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 6eb8433a-9eac-4763-88e3-5988ada403f7 '!=' 6eb8433a-9eac-4763-88e3-5988ada403f7 ']' 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 41408 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 41408 ']' 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 41408 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 41408 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 41408' 00:17:03.593 killing process with pid 41408 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 41408 00:17:03.593 [2024-07-15 13:39:51.095186] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:03.593 [2024-07-15 13:39:51.095244] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:03.593 [2024-07-15 13:39:51.095289] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:03.593 [2024-07-15 13:39:51.095297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d43800 name raid_bdev1, state offline 00:17:03.593 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 41408 00:17:03.593 [2024-07-15 13:39:51.130885] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:03.850 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:03.850 00:17:03.850 real 0m12.619s 00:17:03.850 user 0m22.512s 00:17:03.850 sys 0m2.481s 00:17:03.850 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:03.850 13:39:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.850 ************************************ 00:17:03.850 END TEST raid_superblock_test 00:17:03.850 ************************************ 00:17:03.850 13:39:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:03.850 13:39:51 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:17:03.850 13:39:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:03.850 13:39:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:03.850 13:39:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:03.850 ************************************ 00:17:03.850 START TEST raid_read_error_test 00:17:03.850 ************************************ 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 read 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.w4axahyNkF 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=43369 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 43369 /var/tmp/spdk-raid.sock 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:03.850 13:39:51 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 43369 ']' 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:03.850 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:03.850 13:39:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.109 [2024-07-15 13:39:51.470081] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:17:04.109 [2024-07-15 13:39:51.470133] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid43369 ] 00:17:04.109 [2024-07-15 13:39:51.555935] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:04.109 [2024-07-15 13:39:51.644199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:04.109 [2024-07-15 13:39:51.711128] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:04.109 [2024-07-15 13:39:51.711158] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:04.675 13:39:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:04.675 13:39:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:04.675 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:04.675 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:04.934 BaseBdev1_malloc 00:17:04.934 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:05.192 true 00:17:05.192 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:05.192 [2024-07-15 13:39:52.766323] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:05.192 [2024-07-15 13:39:52.766358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.192 [2024-07-15 13:39:52.766374] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a9990 00:17:05.192 [2024-07-15 13:39:52.766383] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.192 [2024-07-15 13:39:52.767730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.192 [2024-07-15 13:39:52.767753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:05.192 BaseBdev1 00:17:05.192 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for 
bdev in "${base_bdevs[@]}" 00:17:05.192 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:05.449 BaseBdev2_malloc 00:17:05.449 13:39:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:05.708 true 00:17:05.708 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:05.708 [2024-07-15 13:39:53.271246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:05.708 [2024-07-15 13:39:53.271280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.708 [2024-07-15 13:39:53.271312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19ae1d0 00:17:05.708 [2024-07-15 13:39:53.271320] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.708 [2024-07-15 13:39:53.272499] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.708 [2024-07-15 13:39:53.272522] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:05.708 BaseBdev2 00:17:05.708 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:05.708 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:05.966 BaseBdev3_malloc 00:17:05.966 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:06.224 true 00:17:06.224 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:06.224 [2024-07-15 13:39:53.809529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:06.224 [2024-07-15 13:39:53.809563] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.224 [2024-07-15 13:39:53.809595] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b0490 00:17:06.224 [2024-07-15 13:39:53.809603] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.224 [2024-07-15 13:39:53.810770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.224 [2024-07-15 13:39:53.810793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:06.224 BaseBdev3 00:17:06.224 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:06.224 13:39:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:06.481 BaseBdev4_malloc 00:17:06.481 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:06.739 true 00:17:06.739 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:06.739 [2024-07-15 13:39:54.335708] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:06.739 [2024-07-15 13:39:54.335742] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.739 [2024-07-15 13:39:54.335758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b1360 00:17:06.739 [2024-07-15 13:39:54.335766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.739 [2024-07-15 13:39:54.336897] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.739 [2024-07-15 13:39:54.336920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:06.739 BaseBdev4 00:17:06.740 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:06.998 [2024-07-15 13:39:54.504176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:06.998 [2024-07-15 13:39:54.505154] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:06.998 [2024-07-15 13:39:54.505202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:06.998 [2024-07-15 13:39:54.505244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:06.998 [2024-07-15 13:39:54.505412] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19ab4e0 00:17:06.998 [2024-07-15 13:39:54.505420] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:06.998 [2024-07-15 13:39:54.505562] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ffb20 00:17:06.998 [2024-07-15 13:39:54.505667] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19ab4e0 00:17:06.998 [2024-07-15 13:39:54.505674] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19ab4e0 00:17:06.998 [2024-07-15 13:39:54.505746] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.998 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:07.256 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.256 "name": "raid_bdev1", 00:17:07.256 "uuid": "50ac66db-ac19-4522-b30f-e244df60b66c", 00:17:07.256 "strip_size_kb": 64, 00:17:07.256 "state": "online", 00:17:07.256 "raid_level": "concat", 00:17:07.256 "superblock": true, 00:17:07.256 "num_base_bdevs": 4, 00:17:07.256 "num_base_bdevs_discovered": 4, 00:17:07.256 "num_base_bdevs_operational": 4, 00:17:07.256 "base_bdevs_list": [ 00:17:07.256 { 00:17:07.256 "name": "BaseBdev1", 00:17:07.256 "uuid": "54ba2d76-cdb3-577d-b7e9-e449ef788c87", 00:17:07.256 "is_configured": true, 00:17:07.256 "data_offset": 2048, 00:17:07.256 "data_size": 63488 00:17:07.256 }, 00:17:07.256 { 00:17:07.256 "name": "BaseBdev2", 00:17:07.256 "uuid": "138e3f18-f518-5017-9a01-549d2f1b3d97", 00:17:07.256 "is_configured": true, 00:17:07.256 "data_offset": 2048, 00:17:07.256 "data_size": 63488 00:17:07.256 }, 00:17:07.256 { 00:17:07.256 "name": "BaseBdev3", 00:17:07.256 "uuid": "774e435a-3b5a-5029-b081-ba76d267e379", 00:17:07.256 "is_configured": true, 00:17:07.256 "data_offset": 2048, 00:17:07.256 "data_size": 63488 00:17:07.256 }, 00:17:07.256 { 00:17:07.256 "name": "BaseBdev4", 00:17:07.256 "uuid": "25b4bbc8-aae7-5d55-9a83-228efb691a25", 00:17:07.256 "is_configured": true, 00:17:07.256 "data_offset": 2048, 00:17:07.256 "data_size": 63488 00:17:07.256 } 00:17:07.256 ] 00:17:07.256 }' 00:17:07.256 13:39:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.256 13:39:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.822 13:39:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:07.822 13:39:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:07.822 [2024-07-15 13:39:55.246288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x199d880 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:08.817 13:39:56 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.817 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:09.075 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.075 "name": "raid_bdev1", 00:17:09.075 "uuid": "50ac66db-ac19-4522-b30f-e244df60b66c", 00:17:09.075 "strip_size_kb": 64, 00:17:09.075 "state": "online", 00:17:09.075 "raid_level": "concat", 00:17:09.075 "superblock": true, 00:17:09.075 "num_base_bdevs": 4, 00:17:09.075 "num_base_bdevs_discovered": 4, 00:17:09.075 "num_base_bdevs_operational": 4, 00:17:09.075 "base_bdevs_list": [ 00:17:09.075 { 00:17:09.075 "name": "BaseBdev1", 00:17:09.075 "uuid": "54ba2d76-cdb3-577d-b7e9-e449ef788c87", 00:17:09.075 "is_configured": true, 00:17:09.075 "data_offset": 2048, 00:17:09.075 "data_size": 63488 00:17:09.075 }, 00:17:09.075 { 00:17:09.075 "name": "BaseBdev2", 00:17:09.075 "uuid": "138e3f18-f518-5017-9a01-549d2f1b3d97", 00:17:09.075 "is_configured": true, 00:17:09.075 "data_offset": 2048, 00:17:09.075 "data_size": 63488 00:17:09.075 }, 00:17:09.075 { 00:17:09.075 "name": "BaseBdev3", 00:17:09.075 "uuid": "774e435a-3b5a-5029-b081-ba76d267e379", 00:17:09.075 "is_configured": true, 00:17:09.075 "data_offset": 2048, 00:17:09.075 "data_size": 63488 00:17:09.075 }, 00:17:09.075 { 00:17:09.075 "name": "BaseBdev4", 00:17:09.075 "uuid": "25b4bbc8-aae7-5d55-9a83-228efb691a25", 00:17:09.075 "is_configured": true, 00:17:09.075 "data_offset": 2048, 00:17:09.075 "data_size": 63488 00:17:09.075 } 00:17:09.075 ] 00:17:09.075 }' 00:17:09.075 13:39:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.075 13:39:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:09.641 [2024-07-15 13:39:57.211628] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.641 [2024-07-15 13:39:57.211655] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:09.641 [2024-07-15 13:39:57.213819] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.641 [2024-07-15 13:39:57.213847] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:09.641 [2024-07-15 13:39:57.213876] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.641 [2024-07-15 13:39:57.213883] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ab4e0 name raid_bdev1, state offline 00:17:09.641 0 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 43369 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 43369 ']' 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 43369 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:09.641 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 43369 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 43369' 00:17:09.900 killing process with pid 43369 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 43369 00:17:09.900 [2024-07-15 13:39:57.277529] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 43369 00:17:09.900 [2024-07-15 13:39:57.307719] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.w4axahyNkF 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:09.900 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:17:10.160 00:17:10.160 real 0m6.124s 00:17:10.160 user 0m9.394s 00:17:10.160 sys 0m1.135s 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:10.160 13:39:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.160 ************************************ 00:17:10.160 END TEST raid_read_error_test 00:17:10.160 ************************************ 00:17:10.160 13:39:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:10.160 13:39:57 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:17:10.160 13:39:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:10.160 13:39:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:10.160 13:39:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:10.160 ************************************ 00:17:10.160 START TEST raid_write_error_test 00:17:10.160 ************************************ 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test 
concat 4 write 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.rHPhVyUtNY 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=44291 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 44291 /var/tmp/spdk-raid.sock 
00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 44291 ']' 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:10.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:10.160 13:39:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.160 [2024-07-15 13:39:57.640383] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:17:10.160 [2024-07-15 13:39:57.640437] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid44291 ] 00:17:10.160 [2024-07-15 13:39:57.725039] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:10.419 [2024-07-15 13:39:57.813064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:10.419 [2024-07-15 13:39:57.863829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.419 [2024-07-15 13:39:57.863860] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.985 13:39:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:10.985 13:39:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:10.985 13:39:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:10.985 13:39:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:11.243 BaseBdev1_malloc 00:17:11.243 13:39:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:11.243 true 00:17:11.243 13:39:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:11.500 [2024-07-15 13:39:58.951528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:11.500 [2024-07-15 13:39:58.951565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.500 [2024-07-15 13:39:58.951596] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1204990 00:17:11.500 [2024-07-15 13:39:58.951605] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.500 [2024-07-15 13:39:58.953021] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.500 [2024-07-15 13:39:58.953045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:11.500 BaseBdev1 00:17:11.500 13:39:58 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:11.500 13:39:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:11.758 BaseBdev2_malloc 00:17:11.758 13:39:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:11.758 true 00:17:11.758 13:39:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:12.015 [2024-07-15 13:39:59.485902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:12.015 [2024-07-15 13:39:59.485938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.015 [2024-07-15 13:39:59.485968] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12091d0 00:17:12.015 [2024-07-15 13:39:59.485977] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.015 [2024-07-15 13:39:59.487057] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.015 [2024-07-15 13:39:59.487079] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:12.015 BaseBdev2 00:17:12.015 13:39:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:12.015 13:39:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:12.273 BaseBdev3_malloc 00:17:12.273 13:39:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:12.273 true 00:17:12.273 13:39:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:12.531 [2024-07-15 13:40:00.010931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:12.531 [2024-07-15 13:40:00.010969] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.531 [2024-07-15 13:40:00.010986] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x120b490 00:17:12.531 [2024-07-15 13:40:00.011002] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.531 [2024-07-15 13:40:00.012277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.531 [2024-07-15 13:40:00.012300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:12.531 BaseBdev3 00:17:12.531 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:12.531 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:12.790 BaseBdev4_malloc 00:17:12.790 13:40:00 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:12.790 true 00:17:12.790 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:13.049 [2024-07-15 13:40:00.545256] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:13.049 [2024-07-15 13:40:00.545293] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.049 [2024-07-15 13:40:00.545309] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x120c360 00:17:13.049 [2024-07-15 13:40:00.545317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.049 [2024-07-15 13:40:00.546507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.049 [2024-07-15 13:40:00.546530] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:13.049 BaseBdev4 00:17:13.049 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:13.308 [2024-07-15 13:40:00.713718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.308 [2024-07-15 13:40:00.714700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:13.308 [2024-07-15 13:40:00.714746] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:13.308 [2024-07-15 13:40:00.714785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:13.308 [2024-07-15 13:40:00.714940] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12064e0 00:17:13.308 [2024-07-15 13:40:00.714947] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:13.308 [2024-07-15 13:40:00.715093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x105ab20 00:17:13.308 [2024-07-15 13:40:00.715198] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12064e0 00:17:13.308 [2024-07-15 13:40:00.715205] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12064e0 00:17:13.308 [2024-07-15 13:40:00.715275] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.308 "name": "raid_bdev1", 00:17:13.308 "uuid": "e0534b3b-6bb9-48e1-b25a-ec04d913e934", 00:17:13.308 "strip_size_kb": 64, 00:17:13.308 "state": "online", 00:17:13.308 "raid_level": "concat", 00:17:13.308 "superblock": true, 00:17:13.308 "num_base_bdevs": 4, 00:17:13.308 "num_base_bdevs_discovered": 4, 00:17:13.308 "num_base_bdevs_operational": 4, 00:17:13.308 "base_bdevs_list": [ 00:17:13.308 { 00:17:13.308 "name": "BaseBdev1", 00:17:13.308 "uuid": "0cbcfc33-d42b-5919-8f0e-fdd6ae90afcf", 00:17:13.308 "is_configured": true, 00:17:13.308 "data_offset": 2048, 00:17:13.308 "data_size": 63488 00:17:13.308 }, 00:17:13.308 { 00:17:13.308 "name": "BaseBdev2", 00:17:13.308 "uuid": "acb65818-32d1-5678-a433-9f02ff726208", 00:17:13.308 "is_configured": true, 00:17:13.308 "data_offset": 2048, 00:17:13.308 "data_size": 63488 00:17:13.308 }, 00:17:13.308 { 00:17:13.308 "name": "BaseBdev3", 00:17:13.308 "uuid": "be5b7f0b-7413-5d3c-a361-391ca1e3fe03", 00:17:13.308 "is_configured": true, 00:17:13.308 "data_offset": 2048, 00:17:13.308 "data_size": 63488 00:17:13.308 }, 00:17:13.308 { 00:17:13.308 "name": "BaseBdev4", 00:17:13.308 "uuid": "c60f5a2e-2482-5965-946b-8677eb190ec7", 00:17:13.308 "is_configured": true, 00:17:13.308 "data_offset": 2048, 00:17:13.308 "data_size": 63488 00:17:13.308 } 00:17:13.308 ] 00:17:13.308 }' 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.308 13:40:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.874 13:40:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:13.874 13:40:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:13.874 [2024-07-15 13:40:01.459852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f8880 00:17:14.809 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.067 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.325 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.325 "name": "raid_bdev1", 00:17:15.325 "uuid": "e0534b3b-6bb9-48e1-b25a-ec04d913e934", 00:17:15.325 "strip_size_kb": 64, 00:17:15.325 "state": "online", 00:17:15.325 "raid_level": "concat", 00:17:15.325 "superblock": true, 00:17:15.325 "num_base_bdevs": 4, 00:17:15.325 "num_base_bdevs_discovered": 4, 00:17:15.325 "num_base_bdevs_operational": 4, 00:17:15.325 "base_bdevs_list": [ 00:17:15.325 { 00:17:15.325 "name": "BaseBdev1", 00:17:15.325 "uuid": "0cbcfc33-d42b-5919-8f0e-fdd6ae90afcf", 00:17:15.325 "is_configured": true, 00:17:15.325 "data_offset": 2048, 00:17:15.325 "data_size": 63488 00:17:15.325 }, 00:17:15.325 { 00:17:15.325 "name": "BaseBdev2", 00:17:15.325 "uuid": "acb65818-32d1-5678-a433-9f02ff726208", 00:17:15.325 "is_configured": true, 00:17:15.325 "data_offset": 2048, 00:17:15.325 "data_size": 63488 00:17:15.325 }, 00:17:15.325 { 00:17:15.325 "name": "BaseBdev3", 00:17:15.325 "uuid": "be5b7f0b-7413-5d3c-a361-391ca1e3fe03", 00:17:15.325 "is_configured": true, 00:17:15.325 "data_offset": 2048, 00:17:15.325 "data_size": 63488 00:17:15.325 }, 00:17:15.325 { 00:17:15.325 "name": "BaseBdev4", 00:17:15.325 "uuid": "c60f5a2e-2482-5965-946b-8677eb190ec7", 00:17:15.325 "is_configured": true, 00:17:15.325 "data_offset": 2048, 00:17:15.325 "data_size": 63488 00:17:15.325 } 00:17:15.325 ] 00:17:15.325 }' 00:17:15.325 13:40:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.325 13:40:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:15.889 [2024-07-15 13:40:03.397311] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:15.889 [2024-07-15 13:40:03.397349] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:15.889 [2024-07-15 13:40:03.399395] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:15.889 [2024-07-15 13:40:03.399422] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:15.889 [2024-07-15 13:40:03.399450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all 
in destruct 00:17:15.889 [2024-07-15 13:40:03.399458] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12064e0 name raid_bdev1, state offline 00:17:15.889 0 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 44291 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 44291 ']' 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 44291 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 44291 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 44291' 00:17:15.889 killing process with pid 44291 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 44291 00:17:15.889 [2024-07-15 13:40:03.463444] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:15.889 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 44291 00:17:15.889 [2024-07-15 13:40:03.493333] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.rHPhVyUtNY 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:17:16.146 00:17:16.146 real 0m6.113s 00:17:16.146 user 0m9.480s 00:17:16.146 sys 0m1.070s 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:16.146 13:40:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.146 ************************************ 00:17:16.146 END TEST raid_write_error_test 00:17:16.146 ************************************ 00:17:16.146 13:40:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:16.146 13:40:03 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:16.146 13:40:03 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:17:16.146 13:40:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:16.146 13:40:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:16.146 13:40:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:16.404 ************************************ 00:17:16.404 START TEST 
raid_state_function_test 00:17:16.404 ************************************ 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:16.404 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=45156 
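(A condensed, hand-runnable sketch of the RPC sequence this state-function test drives once the bdev_svc app above is listening. The rpc.py path, socket, malloc sizes and bdev names are copied from the trace; the single loop and the state probe are illustrative only — the harness additionally deletes and re-creates Existed_Raid between steps to re-exercise the configuring path.)

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Ask for a raid1 array before any base bdev exists; the raid bdev is created in the
  # "configuring" state and every missing base is logged as "doesn't exist now".
  $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # Create the malloc base bdevs one at a time (32 MiB, 512-byte blocks, as in the trace);
  # each new bdev is claimed on arrival, num_base_bdevs_discovered climbs toward 4, and the
  # array only transitions from "configuring" to "online" once all four bases are present.
  for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
      $rpc bdev_malloc_create 32 512 -b "$b"
      $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  done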
00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 45156' 00:17:16.405 Process raid pid: 45156 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 45156 /var/tmp/spdk-raid.sock 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 45156 ']' 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:16.405 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:16.405 13:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.405 [2024-07-15 13:40:03.812192] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:17:16.405 [2024-07-15 13:40:03.812237] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:16.405 [2024-07-15 13:40:03.901407] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:16.405 [2024-07-15 13:40:03.989716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:16.662 [2024-07-15 13:40:04.043255] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:16.662 [2024-07-15 13:40:04.043279] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.228 13:40:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:17.228 13:40:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:17.228 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:17.228 [2024-07-15 13:40:04.780107] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:17.228 [2024-07-15 13:40:04.780143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:17.228 [2024-07-15 13:40:04.780151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:17.228 [2024-07-15 13:40:04.780174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:17.228 [2024-07-15 13:40:04.780180] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:17.228 [2024-07-15 13:40:04.780188] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:17.228 [2024-07-15 13:40:04.780193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:17.228 [2024-07-15 13:40:04.780200] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:17.229 13:40:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.229 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.488 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.488 "name": "Existed_Raid", 00:17:17.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.488 "strip_size_kb": 0, 00:17:17.488 "state": "configuring", 00:17:17.488 "raid_level": "raid1", 00:17:17.488 "superblock": false, 00:17:17.488 "num_base_bdevs": 4, 00:17:17.488 "num_base_bdevs_discovered": 0, 00:17:17.488 "num_base_bdevs_operational": 4, 00:17:17.488 "base_bdevs_list": [ 00:17:17.488 { 00:17:17.488 "name": "BaseBdev1", 00:17:17.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.488 "is_configured": false, 00:17:17.488 "data_offset": 0, 00:17:17.488 "data_size": 0 00:17:17.488 }, 00:17:17.488 { 00:17:17.488 "name": "BaseBdev2", 00:17:17.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.489 "is_configured": false, 00:17:17.489 "data_offset": 0, 00:17:17.489 "data_size": 0 00:17:17.489 }, 00:17:17.489 { 00:17:17.489 "name": "BaseBdev3", 00:17:17.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.489 "is_configured": false, 00:17:17.489 "data_offset": 0, 00:17:17.489 "data_size": 0 00:17:17.489 }, 00:17:17.489 { 00:17:17.489 "name": "BaseBdev4", 00:17:17.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.489 "is_configured": false, 00:17:17.489 "data_offset": 0, 00:17:17.489 "data_size": 0 00:17:17.489 } 00:17:17.489 ] 00:17:17.489 }' 00:17:17.489 13:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.489 13:40:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.057 13:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:18.057 [2024-07-15 13:40:05.622181] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:18.057 [2024-07-15 13:40:05.622206] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2340f70 name Existed_Raid, 
state configuring 00:17:18.057 13:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:18.316 [2024-07-15 13:40:05.806679] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:18.316 [2024-07-15 13:40:05.806707] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:18.316 [2024-07-15 13:40:05.806713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:18.316 [2024-07-15 13:40:05.806721] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:18.316 [2024-07-15 13:40:05.806743] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:18.316 [2024-07-15 13:40:05.806751] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:18.316 [2024-07-15 13:40:05.806756] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:18.316 [2024-07-15 13:40:05.806763] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:18.316 13:40:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:18.575 [2024-07-15 13:40:05.999877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:18.575 BaseBdev1 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:18.575 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:18.834 [ 00:17:18.834 { 00:17:18.834 "name": "BaseBdev1", 00:17:18.834 "aliases": [ 00:17:18.834 "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1" 00:17:18.834 ], 00:17:18.834 "product_name": "Malloc disk", 00:17:18.834 "block_size": 512, 00:17:18.834 "num_blocks": 65536, 00:17:18.834 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:18.834 "assigned_rate_limits": { 00:17:18.834 "rw_ios_per_sec": 0, 00:17:18.834 "rw_mbytes_per_sec": 0, 00:17:18.834 "r_mbytes_per_sec": 0, 00:17:18.834 "w_mbytes_per_sec": 0 00:17:18.834 }, 00:17:18.834 "claimed": true, 00:17:18.834 "claim_type": "exclusive_write", 00:17:18.834 "zoned": false, 00:17:18.834 "supported_io_types": { 00:17:18.834 "read": true, 00:17:18.834 "write": true, 00:17:18.834 "unmap": true, 00:17:18.834 
"flush": true, 00:17:18.834 "reset": true, 00:17:18.834 "nvme_admin": false, 00:17:18.834 "nvme_io": false, 00:17:18.834 "nvme_io_md": false, 00:17:18.834 "write_zeroes": true, 00:17:18.834 "zcopy": true, 00:17:18.834 "get_zone_info": false, 00:17:18.834 "zone_management": false, 00:17:18.834 "zone_append": false, 00:17:18.834 "compare": false, 00:17:18.834 "compare_and_write": false, 00:17:18.834 "abort": true, 00:17:18.834 "seek_hole": false, 00:17:18.834 "seek_data": false, 00:17:18.834 "copy": true, 00:17:18.834 "nvme_iov_md": false 00:17:18.834 }, 00:17:18.834 "memory_domains": [ 00:17:18.834 { 00:17:18.834 "dma_device_id": "system", 00:17:18.835 "dma_device_type": 1 00:17:18.835 }, 00:17:18.835 { 00:17:18.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.835 "dma_device_type": 2 00:17:18.835 } 00:17:18.835 ], 00:17:18.835 "driver_specific": {} 00:17:18.835 } 00:17:18.835 ] 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.835 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:19.094 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.094 "name": "Existed_Raid", 00:17:19.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.094 "strip_size_kb": 0, 00:17:19.094 "state": "configuring", 00:17:19.094 "raid_level": "raid1", 00:17:19.094 "superblock": false, 00:17:19.094 "num_base_bdevs": 4, 00:17:19.094 "num_base_bdevs_discovered": 1, 00:17:19.094 "num_base_bdevs_operational": 4, 00:17:19.094 "base_bdevs_list": [ 00:17:19.094 { 00:17:19.094 "name": "BaseBdev1", 00:17:19.094 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:19.094 "is_configured": true, 00:17:19.094 "data_offset": 0, 00:17:19.094 "data_size": 65536 00:17:19.094 }, 00:17:19.094 { 00:17:19.094 "name": "BaseBdev2", 00:17:19.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.094 "is_configured": false, 00:17:19.094 "data_offset": 0, 00:17:19.094 "data_size": 0 00:17:19.094 }, 00:17:19.094 { 00:17:19.094 "name": "BaseBdev3", 00:17:19.094 "uuid": "00000000-0000-0000-0000-000000000000", 
00:17:19.094 "is_configured": false, 00:17:19.094 "data_offset": 0, 00:17:19.094 "data_size": 0 00:17:19.094 }, 00:17:19.094 { 00:17:19.094 "name": "BaseBdev4", 00:17:19.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.094 "is_configured": false, 00:17:19.094 "data_offset": 0, 00:17:19.094 "data_size": 0 00:17:19.094 } 00:17:19.094 ] 00:17:19.094 }' 00:17:19.094 13:40:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.094 13:40:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.660 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:19.660 [2024-07-15 13:40:07.178912] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:19.660 [2024-07-15 13:40:07.178944] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23407e0 name Existed_Raid, state configuring 00:17:19.660 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:19.919 [2024-07-15 13:40:07.355394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:19.919 [2024-07-15 13:40:07.356427] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:19.919 [2024-07-15 13:40:07.356454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:19.919 [2024-07-15 13:40:07.356461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:19.919 [2024-07-15 13:40:07.356468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:19.919 [2024-07-15 13:40:07.356491] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:19.919 [2024-07-15 13:40:07.356498] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.919 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.179 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.179 "name": "Existed_Raid", 00:17:20.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.179 "strip_size_kb": 0, 00:17:20.179 "state": "configuring", 00:17:20.179 "raid_level": "raid1", 00:17:20.179 "superblock": false, 00:17:20.179 "num_base_bdevs": 4, 00:17:20.179 "num_base_bdevs_discovered": 1, 00:17:20.179 "num_base_bdevs_operational": 4, 00:17:20.179 "base_bdevs_list": [ 00:17:20.179 { 00:17:20.179 "name": "BaseBdev1", 00:17:20.179 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:20.179 "is_configured": true, 00:17:20.179 "data_offset": 0, 00:17:20.179 "data_size": 65536 00:17:20.179 }, 00:17:20.179 { 00:17:20.179 "name": "BaseBdev2", 00:17:20.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.179 "is_configured": false, 00:17:20.179 "data_offset": 0, 00:17:20.179 "data_size": 0 00:17:20.179 }, 00:17:20.179 { 00:17:20.179 "name": "BaseBdev3", 00:17:20.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.179 "is_configured": false, 00:17:20.179 "data_offset": 0, 00:17:20.179 "data_size": 0 00:17:20.179 }, 00:17:20.179 { 00:17:20.179 "name": "BaseBdev4", 00:17:20.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.179 "is_configured": false, 00:17:20.179 "data_offset": 0, 00:17:20.179 "data_size": 0 00:17:20.179 } 00:17:20.179 ] 00:17:20.179 }' 00:17:20.179 13:40:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.179 13:40:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.438 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:20.697 [2024-07-15 13:40:08.172394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:20.697 BaseBdev2 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:20.697 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.956 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:20.956 [ 00:17:20.956 { 00:17:20.956 "name": "BaseBdev2", 00:17:20.956 "aliases": [ 
00:17:20.956 "ee3fa4b4-f9a4-4571-97aa-28a2c6635926" 00:17:20.956 ], 00:17:20.956 "product_name": "Malloc disk", 00:17:20.956 "block_size": 512, 00:17:20.956 "num_blocks": 65536, 00:17:20.956 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:20.956 "assigned_rate_limits": { 00:17:20.956 "rw_ios_per_sec": 0, 00:17:20.956 "rw_mbytes_per_sec": 0, 00:17:20.956 "r_mbytes_per_sec": 0, 00:17:20.956 "w_mbytes_per_sec": 0 00:17:20.956 }, 00:17:20.956 "claimed": true, 00:17:20.956 "claim_type": "exclusive_write", 00:17:20.956 "zoned": false, 00:17:20.956 "supported_io_types": { 00:17:20.956 "read": true, 00:17:20.956 "write": true, 00:17:20.956 "unmap": true, 00:17:20.956 "flush": true, 00:17:20.956 "reset": true, 00:17:20.956 "nvme_admin": false, 00:17:20.956 "nvme_io": false, 00:17:20.956 "nvme_io_md": false, 00:17:20.956 "write_zeroes": true, 00:17:20.956 "zcopy": true, 00:17:20.956 "get_zone_info": false, 00:17:20.956 "zone_management": false, 00:17:20.956 "zone_append": false, 00:17:20.956 "compare": false, 00:17:20.956 "compare_and_write": false, 00:17:20.956 "abort": true, 00:17:20.956 "seek_hole": false, 00:17:20.956 "seek_data": false, 00:17:20.956 "copy": true, 00:17:20.956 "nvme_iov_md": false 00:17:20.956 }, 00:17:20.956 "memory_domains": [ 00:17:20.956 { 00:17:20.956 "dma_device_id": "system", 00:17:20.956 "dma_device_type": 1 00:17:20.956 }, 00:17:20.956 { 00:17:20.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.956 "dma_device_type": 2 00:17:20.956 } 00:17:20.956 ], 00:17:20.956 "driver_specific": {} 00:17:20.956 } 00:17:20.956 ] 00:17:20.956 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:20.956 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.957 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.213 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.213 "name": "Existed_Raid", 00:17:21.213 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:21.213 "strip_size_kb": 0, 00:17:21.213 "state": "configuring", 00:17:21.213 "raid_level": "raid1", 00:17:21.213 "superblock": false, 00:17:21.213 "num_base_bdevs": 4, 00:17:21.213 "num_base_bdevs_discovered": 2, 00:17:21.213 "num_base_bdevs_operational": 4, 00:17:21.213 "base_bdevs_list": [ 00:17:21.213 { 00:17:21.213 "name": "BaseBdev1", 00:17:21.213 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:21.213 "is_configured": true, 00:17:21.213 "data_offset": 0, 00:17:21.213 "data_size": 65536 00:17:21.213 }, 00:17:21.213 { 00:17:21.213 "name": "BaseBdev2", 00:17:21.213 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:21.213 "is_configured": true, 00:17:21.213 "data_offset": 0, 00:17:21.213 "data_size": 65536 00:17:21.213 }, 00:17:21.213 { 00:17:21.213 "name": "BaseBdev3", 00:17:21.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.214 "is_configured": false, 00:17:21.214 "data_offset": 0, 00:17:21.214 "data_size": 0 00:17:21.214 }, 00:17:21.214 { 00:17:21.214 "name": "BaseBdev4", 00:17:21.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.214 "is_configured": false, 00:17:21.214 "data_offset": 0, 00:17:21.214 "data_size": 0 00:17:21.214 } 00:17:21.214 ] 00:17:21.214 }' 00:17:21.214 13:40:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.214 13:40:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.780 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:21.780 [2024-07-15 13:40:09.386423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:21.780 BaseBdev3 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.039 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:22.297 [ 00:17:22.298 { 00:17:22.298 "name": "BaseBdev3", 00:17:22.298 "aliases": [ 00:17:22.298 "194b2d93-999e-45b2-86f3-cf0efc0ea0ff" 00:17:22.298 ], 00:17:22.298 "product_name": "Malloc disk", 00:17:22.298 "block_size": 512, 00:17:22.298 "num_blocks": 65536, 00:17:22.298 "uuid": "194b2d93-999e-45b2-86f3-cf0efc0ea0ff", 00:17:22.298 "assigned_rate_limits": { 00:17:22.298 "rw_ios_per_sec": 0, 00:17:22.298 "rw_mbytes_per_sec": 0, 00:17:22.298 "r_mbytes_per_sec": 0, 00:17:22.298 "w_mbytes_per_sec": 0 00:17:22.298 }, 00:17:22.298 "claimed": true, 00:17:22.298 "claim_type": "exclusive_write", 00:17:22.298 "zoned": 
false, 00:17:22.298 "supported_io_types": { 00:17:22.298 "read": true, 00:17:22.298 "write": true, 00:17:22.298 "unmap": true, 00:17:22.298 "flush": true, 00:17:22.298 "reset": true, 00:17:22.298 "nvme_admin": false, 00:17:22.298 "nvme_io": false, 00:17:22.298 "nvme_io_md": false, 00:17:22.298 "write_zeroes": true, 00:17:22.298 "zcopy": true, 00:17:22.298 "get_zone_info": false, 00:17:22.298 "zone_management": false, 00:17:22.298 "zone_append": false, 00:17:22.298 "compare": false, 00:17:22.298 "compare_and_write": false, 00:17:22.298 "abort": true, 00:17:22.298 "seek_hole": false, 00:17:22.298 "seek_data": false, 00:17:22.298 "copy": true, 00:17:22.298 "nvme_iov_md": false 00:17:22.298 }, 00:17:22.298 "memory_domains": [ 00:17:22.298 { 00:17:22.298 "dma_device_id": "system", 00:17:22.298 "dma_device_type": 1 00:17:22.298 }, 00:17:22.298 { 00:17:22.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.298 "dma_device_type": 2 00:17:22.298 } 00:17:22.298 ], 00:17:22.298 "driver_specific": {} 00:17:22.298 } 00:17:22.298 ] 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.298 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.557 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.557 "name": "Existed_Raid", 00:17:22.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.557 "strip_size_kb": 0, 00:17:22.557 "state": "configuring", 00:17:22.557 "raid_level": "raid1", 00:17:22.557 "superblock": false, 00:17:22.557 "num_base_bdevs": 4, 00:17:22.557 "num_base_bdevs_discovered": 3, 00:17:22.557 "num_base_bdevs_operational": 4, 00:17:22.557 "base_bdevs_list": [ 00:17:22.557 { 00:17:22.557 "name": "BaseBdev1", 00:17:22.557 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:22.557 "is_configured": true, 00:17:22.557 "data_offset": 0, 00:17:22.557 "data_size": 65536 
00:17:22.557 }, 00:17:22.557 { 00:17:22.557 "name": "BaseBdev2", 00:17:22.557 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:22.557 "is_configured": true, 00:17:22.557 "data_offset": 0, 00:17:22.557 "data_size": 65536 00:17:22.557 }, 00:17:22.557 { 00:17:22.557 "name": "BaseBdev3", 00:17:22.557 "uuid": "194b2d93-999e-45b2-86f3-cf0efc0ea0ff", 00:17:22.557 "is_configured": true, 00:17:22.557 "data_offset": 0, 00:17:22.557 "data_size": 65536 00:17:22.557 }, 00:17:22.557 { 00:17:22.557 "name": "BaseBdev4", 00:17:22.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.557 "is_configured": false, 00:17:22.557 "data_offset": 0, 00:17:22.557 "data_size": 0 00:17:22.557 } 00:17:22.557 ] 00:17:22.557 }' 00:17:22.557 13:40:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.557 13:40:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.125 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:23.125 [2024-07-15 13:40:10.600604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:23.125 [2024-07-15 13:40:10.600637] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2341840 00:17:23.125 [2024-07-15 13:40:10.600649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:23.125 [2024-07-15 13:40:10.600801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2341480 00:17:23.125 [2024-07-15 13:40:10.600891] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2341840 00:17:23.125 [2024-07-15 13:40:10.600897] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2341840 00:17:23.125 [2024-07-15 13:40:10.601022] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:23.125 BaseBdev4 00:17:23.125 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:23.126 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:23.126 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:23.126 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:23.126 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:23.126 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:23.126 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:23.385 [ 00:17:23.385 { 00:17:23.385 "name": "BaseBdev4", 00:17:23.385 "aliases": [ 00:17:23.385 "f83609a3-9e19-42e6-8922-21afc3dd01a2" 00:17:23.385 ], 00:17:23.385 "product_name": "Malloc disk", 00:17:23.385 "block_size": 512, 00:17:23.385 "num_blocks": 65536, 00:17:23.385 "uuid": "f83609a3-9e19-42e6-8922-21afc3dd01a2", 00:17:23.385 "assigned_rate_limits": { 00:17:23.385 "rw_ios_per_sec": 0, 00:17:23.385 
"rw_mbytes_per_sec": 0, 00:17:23.385 "r_mbytes_per_sec": 0, 00:17:23.385 "w_mbytes_per_sec": 0 00:17:23.385 }, 00:17:23.385 "claimed": true, 00:17:23.385 "claim_type": "exclusive_write", 00:17:23.385 "zoned": false, 00:17:23.385 "supported_io_types": { 00:17:23.385 "read": true, 00:17:23.385 "write": true, 00:17:23.385 "unmap": true, 00:17:23.385 "flush": true, 00:17:23.385 "reset": true, 00:17:23.385 "nvme_admin": false, 00:17:23.385 "nvme_io": false, 00:17:23.385 "nvme_io_md": false, 00:17:23.385 "write_zeroes": true, 00:17:23.385 "zcopy": true, 00:17:23.385 "get_zone_info": false, 00:17:23.385 "zone_management": false, 00:17:23.385 "zone_append": false, 00:17:23.385 "compare": false, 00:17:23.385 "compare_and_write": false, 00:17:23.385 "abort": true, 00:17:23.385 "seek_hole": false, 00:17:23.385 "seek_data": false, 00:17:23.385 "copy": true, 00:17:23.385 "nvme_iov_md": false 00:17:23.385 }, 00:17:23.385 "memory_domains": [ 00:17:23.385 { 00:17:23.385 "dma_device_id": "system", 00:17:23.385 "dma_device_type": 1 00:17:23.385 }, 00:17:23.385 { 00:17:23.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.385 "dma_device_type": 2 00:17:23.385 } 00:17:23.385 ], 00:17:23.385 "driver_specific": {} 00:17:23.385 } 00:17:23.385 ] 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.385 13:40:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.646 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.646 "name": "Existed_Raid", 00:17:23.646 "uuid": "8e2b0f98-385f-4c87-a4a1-cee68565647b", 00:17:23.646 "strip_size_kb": 0, 00:17:23.646 "state": "online", 00:17:23.646 "raid_level": "raid1", 00:17:23.646 "superblock": false, 00:17:23.646 "num_base_bdevs": 4, 00:17:23.646 "num_base_bdevs_discovered": 4, 00:17:23.646 "num_base_bdevs_operational": 4, 00:17:23.646 "base_bdevs_list": [ 00:17:23.646 { 
00:17:23.646 "name": "BaseBdev1", 00:17:23.646 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:23.646 "is_configured": true, 00:17:23.646 "data_offset": 0, 00:17:23.646 "data_size": 65536 00:17:23.646 }, 00:17:23.646 { 00:17:23.646 "name": "BaseBdev2", 00:17:23.646 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:23.646 "is_configured": true, 00:17:23.646 "data_offset": 0, 00:17:23.646 "data_size": 65536 00:17:23.646 }, 00:17:23.646 { 00:17:23.646 "name": "BaseBdev3", 00:17:23.646 "uuid": "194b2d93-999e-45b2-86f3-cf0efc0ea0ff", 00:17:23.646 "is_configured": true, 00:17:23.646 "data_offset": 0, 00:17:23.646 "data_size": 65536 00:17:23.646 }, 00:17:23.646 { 00:17:23.646 "name": "BaseBdev4", 00:17:23.646 "uuid": "f83609a3-9e19-42e6-8922-21afc3dd01a2", 00:17:23.646 "is_configured": true, 00:17:23.646 "data_offset": 0, 00:17:23.646 "data_size": 65536 00:17:23.646 } 00:17:23.646 ] 00:17:23.646 }' 00:17:23.646 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.646 13:40:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:24.284 [2024-07-15 13:40:11.791999] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:24.284 "name": "Existed_Raid", 00:17:24.284 "aliases": [ 00:17:24.284 "8e2b0f98-385f-4c87-a4a1-cee68565647b" 00:17:24.284 ], 00:17:24.284 "product_name": "Raid Volume", 00:17:24.284 "block_size": 512, 00:17:24.284 "num_blocks": 65536, 00:17:24.284 "uuid": "8e2b0f98-385f-4c87-a4a1-cee68565647b", 00:17:24.284 "assigned_rate_limits": { 00:17:24.284 "rw_ios_per_sec": 0, 00:17:24.284 "rw_mbytes_per_sec": 0, 00:17:24.284 "r_mbytes_per_sec": 0, 00:17:24.284 "w_mbytes_per_sec": 0 00:17:24.284 }, 00:17:24.284 "claimed": false, 00:17:24.284 "zoned": false, 00:17:24.284 "supported_io_types": { 00:17:24.284 "read": true, 00:17:24.284 "write": true, 00:17:24.284 "unmap": false, 00:17:24.284 "flush": false, 00:17:24.284 "reset": true, 00:17:24.284 "nvme_admin": false, 00:17:24.284 "nvme_io": false, 00:17:24.284 "nvme_io_md": false, 00:17:24.284 "write_zeroes": true, 00:17:24.284 "zcopy": false, 00:17:24.284 "get_zone_info": false, 00:17:24.284 "zone_management": false, 00:17:24.284 "zone_append": false, 00:17:24.284 "compare": false, 00:17:24.284 "compare_and_write": false, 00:17:24.284 "abort": false, 00:17:24.284 "seek_hole": false, 00:17:24.284 "seek_data": false, 00:17:24.284 "copy": false, 00:17:24.284 "nvme_iov_md": 
false 00:17:24.284 }, 00:17:24.284 "memory_domains": [ 00:17:24.284 { 00:17:24.284 "dma_device_id": "system", 00:17:24.284 "dma_device_type": 1 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.284 "dma_device_type": 2 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "system", 00:17:24.284 "dma_device_type": 1 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.284 "dma_device_type": 2 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "system", 00:17:24.284 "dma_device_type": 1 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.284 "dma_device_type": 2 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "system", 00:17:24.284 "dma_device_type": 1 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.284 "dma_device_type": 2 00:17:24.284 } 00:17:24.284 ], 00:17:24.284 "driver_specific": { 00:17:24.284 "raid": { 00:17:24.284 "uuid": "8e2b0f98-385f-4c87-a4a1-cee68565647b", 00:17:24.284 "strip_size_kb": 0, 00:17:24.284 "state": "online", 00:17:24.284 "raid_level": "raid1", 00:17:24.284 "superblock": false, 00:17:24.284 "num_base_bdevs": 4, 00:17:24.284 "num_base_bdevs_discovered": 4, 00:17:24.284 "num_base_bdevs_operational": 4, 00:17:24.284 "base_bdevs_list": [ 00:17:24.284 { 00:17:24.284 "name": "BaseBdev1", 00:17:24.284 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:24.284 "is_configured": true, 00:17:24.284 "data_offset": 0, 00:17:24.284 "data_size": 65536 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "name": "BaseBdev2", 00:17:24.284 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:24.284 "is_configured": true, 00:17:24.284 "data_offset": 0, 00:17:24.284 "data_size": 65536 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "name": "BaseBdev3", 00:17:24.284 "uuid": "194b2d93-999e-45b2-86f3-cf0efc0ea0ff", 00:17:24.284 "is_configured": true, 00:17:24.284 "data_offset": 0, 00:17:24.284 "data_size": 65536 00:17:24.284 }, 00:17:24.284 { 00:17:24.284 "name": "BaseBdev4", 00:17:24.284 "uuid": "f83609a3-9e19-42e6-8922-21afc3dd01a2", 00:17:24.284 "is_configured": true, 00:17:24.284 "data_offset": 0, 00:17:24.284 "data_size": 65536 00:17:24.284 } 00:17:24.284 ] 00:17:24.284 } 00:17:24.284 } 00:17:24.284 }' 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:24.284 BaseBdev2 00:17:24.284 BaseBdev3 00:17:24.284 BaseBdev4' 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:24.284 13:40:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.544 "name": "BaseBdev1", 00:17:24.544 "aliases": [ 00:17:24.544 "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1" 00:17:24.544 ], 00:17:24.544 "product_name": "Malloc disk", 00:17:24.544 "block_size": 512, 00:17:24.544 "num_blocks": 65536, 00:17:24.544 "uuid": "5f6dcfe8-ece3-431a-94d5-76b1ba0a69b1", 00:17:24.544 
"assigned_rate_limits": { 00:17:24.544 "rw_ios_per_sec": 0, 00:17:24.544 "rw_mbytes_per_sec": 0, 00:17:24.544 "r_mbytes_per_sec": 0, 00:17:24.544 "w_mbytes_per_sec": 0 00:17:24.544 }, 00:17:24.544 "claimed": true, 00:17:24.544 "claim_type": "exclusive_write", 00:17:24.544 "zoned": false, 00:17:24.544 "supported_io_types": { 00:17:24.544 "read": true, 00:17:24.544 "write": true, 00:17:24.544 "unmap": true, 00:17:24.544 "flush": true, 00:17:24.544 "reset": true, 00:17:24.544 "nvme_admin": false, 00:17:24.544 "nvme_io": false, 00:17:24.544 "nvme_io_md": false, 00:17:24.544 "write_zeroes": true, 00:17:24.544 "zcopy": true, 00:17:24.544 "get_zone_info": false, 00:17:24.544 "zone_management": false, 00:17:24.544 "zone_append": false, 00:17:24.544 "compare": false, 00:17:24.544 "compare_and_write": false, 00:17:24.544 "abort": true, 00:17:24.544 "seek_hole": false, 00:17:24.544 "seek_data": false, 00:17:24.544 "copy": true, 00:17:24.544 "nvme_iov_md": false 00:17:24.544 }, 00:17:24.544 "memory_domains": [ 00:17:24.544 { 00:17:24.544 "dma_device_id": "system", 00:17:24.544 "dma_device_type": 1 00:17:24.544 }, 00:17:24.544 { 00:17:24.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.544 "dma_device_type": 2 00:17:24.544 } 00:17:24.544 ], 00:17:24.544 "driver_specific": {} 00:17:24.544 }' 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.544 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:24.806 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.065 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.065 "name": "BaseBdev2", 00:17:25.065 "aliases": [ 00:17:25.065 "ee3fa4b4-f9a4-4571-97aa-28a2c6635926" 00:17:25.065 ], 00:17:25.065 "product_name": "Malloc disk", 00:17:25.065 "block_size": 512, 00:17:25.065 "num_blocks": 65536, 00:17:25.065 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:25.065 "assigned_rate_limits": { 00:17:25.065 "rw_ios_per_sec": 0, 00:17:25.065 "rw_mbytes_per_sec": 0, 00:17:25.065 "r_mbytes_per_sec": 0, 00:17:25.065 "w_mbytes_per_sec": 0 00:17:25.065 
}, 00:17:25.065 "claimed": true, 00:17:25.065 "claim_type": "exclusive_write", 00:17:25.065 "zoned": false, 00:17:25.065 "supported_io_types": { 00:17:25.065 "read": true, 00:17:25.065 "write": true, 00:17:25.065 "unmap": true, 00:17:25.066 "flush": true, 00:17:25.066 "reset": true, 00:17:25.066 "nvme_admin": false, 00:17:25.066 "nvme_io": false, 00:17:25.066 "nvme_io_md": false, 00:17:25.066 "write_zeroes": true, 00:17:25.066 "zcopy": true, 00:17:25.066 "get_zone_info": false, 00:17:25.066 "zone_management": false, 00:17:25.066 "zone_append": false, 00:17:25.066 "compare": false, 00:17:25.066 "compare_and_write": false, 00:17:25.066 "abort": true, 00:17:25.066 "seek_hole": false, 00:17:25.066 "seek_data": false, 00:17:25.066 "copy": true, 00:17:25.066 "nvme_iov_md": false 00:17:25.066 }, 00:17:25.066 "memory_domains": [ 00:17:25.066 { 00:17:25.066 "dma_device_id": "system", 00:17:25.066 "dma_device_type": 1 00:17:25.066 }, 00:17:25.066 { 00:17:25.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.066 "dma_device_type": 2 00:17:25.066 } 00:17:25.066 ], 00:17:25.066 "driver_specific": {} 00:17:25.066 }' 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.066 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:25.324 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.583 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.583 "name": "BaseBdev3", 00:17:25.583 "aliases": [ 00:17:25.583 "194b2d93-999e-45b2-86f3-cf0efc0ea0ff" 00:17:25.583 ], 00:17:25.583 "product_name": "Malloc disk", 00:17:25.583 "block_size": 512, 00:17:25.583 "num_blocks": 65536, 00:17:25.583 "uuid": "194b2d93-999e-45b2-86f3-cf0efc0ea0ff", 00:17:25.583 "assigned_rate_limits": { 00:17:25.583 "rw_ios_per_sec": 0, 00:17:25.583 "rw_mbytes_per_sec": 0, 00:17:25.583 "r_mbytes_per_sec": 0, 00:17:25.583 "w_mbytes_per_sec": 0 00:17:25.583 }, 00:17:25.583 "claimed": true, 00:17:25.583 "claim_type": "exclusive_write", 00:17:25.583 "zoned": false, 00:17:25.583 "supported_io_types": { 00:17:25.583 "read": true, 
00:17:25.583 "write": true, 00:17:25.583 "unmap": true, 00:17:25.583 "flush": true, 00:17:25.583 "reset": true, 00:17:25.583 "nvme_admin": false, 00:17:25.583 "nvme_io": false, 00:17:25.583 "nvme_io_md": false, 00:17:25.583 "write_zeroes": true, 00:17:25.583 "zcopy": true, 00:17:25.583 "get_zone_info": false, 00:17:25.583 "zone_management": false, 00:17:25.583 "zone_append": false, 00:17:25.583 "compare": false, 00:17:25.583 "compare_and_write": false, 00:17:25.583 "abort": true, 00:17:25.583 "seek_hole": false, 00:17:25.583 "seek_data": false, 00:17:25.583 "copy": true, 00:17:25.583 "nvme_iov_md": false 00:17:25.583 }, 00:17:25.583 "memory_domains": [ 00:17:25.583 { 00:17:25.583 "dma_device_id": "system", 00:17:25.583 "dma_device_type": 1 00:17:25.583 }, 00:17:25.583 { 00:17:25.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.583 "dma_device_type": 2 00:17:25.583 } 00:17:25.583 ], 00:17:25.583 "driver_specific": {} 00:17:25.583 }' 00:17:25.583 13:40:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:25.583 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.840 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.840 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:25.840 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:25.840 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:25.840 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.840 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.840 "name": "BaseBdev4", 00:17:25.841 "aliases": [ 00:17:25.841 "f83609a3-9e19-42e6-8922-21afc3dd01a2" 00:17:25.841 ], 00:17:25.841 "product_name": "Malloc disk", 00:17:25.841 "block_size": 512, 00:17:25.841 "num_blocks": 65536, 00:17:25.841 "uuid": "f83609a3-9e19-42e6-8922-21afc3dd01a2", 00:17:25.841 "assigned_rate_limits": { 00:17:25.841 "rw_ios_per_sec": 0, 00:17:25.841 "rw_mbytes_per_sec": 0, 00:17:25.841 "r_mbytes_per_sec": 0, 00:17:25.841 "w_mbytes_per_sec": 0 00:17:25.841 }, 00:17:25.841 "claimed": true, 00:17:25.841 "claim_type": "exclusive_write", 00:17:25.841 "zoned": false, 00:17:25.841 "supported_io_types": { 00:17:25.841 "read": true, 00:17:25.841 "write": true, 00:17:25.841 "unmap": true, 00:17:25.841 "flush": true, 00:17:25.841 "reset": true, 00:17:25.841 "nvme_admin": false, 00:17:25.841 "nvme_io": false, 
00:17:25.841 "nvme_io_md": false, 00:17:25.841 "write_zeroes": true, 00:17:25.841 "zcopy": true, 00:17:25.841 "get_zone_info": false, 00:17:25.841 "zone_management": false, 00:17:25.841 "zone_append": false, 00:17:25.841 "compare": false, 00:17:25.841 "compare_and_write": false, 00:17:25.841 "abort": true, 00:17:25.841 "seek_hole": false, 00:17:25.841 "seek_data": false, 00:17:25.841 "copy": true, 00:17:25.841 "nvme_iov_md": false 00:17:25.841 }, 00:17:25.841 "memory_domains": [ 00:17:25.841 { 00:17:25.841 "dma_device_id": "system", 00:17:25.841 "dma_device_type": 1 00:17:25.841 }, 00:17:25.841 { 00:17:25.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.841 "dma_device_type": 2 00:17:25.841 } 00:17:25.841 ], 00:17:25.841 "driver_specific": {} 00:17:25.841 }' 00:17:25.841 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.098 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:26.357 [2024-07-15 13:40:13.897244] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.357 13:40:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.615 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.615 "name": "Existed_Raid", 00:17:26.615 "uuid": "8e2b0f98-385f-4c87-a4a1-cee68565647b", 00:17:26.615 "strip_size_kb": 0, 00:17:26.615 "state": "online", 00:17:26.615 "raid_level": "raid1", 00:17:26.615 "superblock": false, 00:17:26.615 "num_base_bdevs": 4, 00:17:26.615 "num_base_bdevs_discovered": 3, 00:17:26.615 "num_base_bdevs_operational": 3, 00:17:26.615 "base_bdevs_list": [ 00:17:26.615 { 00:17:26.615 "name": null, 00:17:26.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.615 "is_configured": false, 00:17:26.615 "data_offset": 0, 00:17:26.615 "data_size": 65536 00:17:26.615 }, 00:17:26.615 { 00:17:26.615 "name": "BaseBdev2", 00:17:26.615 "uuid": "ee3fa4b4-f9a4-4571-97aa-28a2c6635926", 00:17:26.615 "is_configured": true, 00:17:26.615 "data_offset": 0, 00:17:26.615 "data_size": 65536 00:17:26.615 }, 00:17:26.615 { 00:17:26.615 "name": "BaseBdev3", 00:17:26.615 "uuid": "194b2d93-999e-45b2-86f3-cf0efc0ea0ff", 00:17:26.615 "is_configured": true, 00:17:26.615 "data_offset": 0, 00:17:26.615 "data_size": 65536 00:17:26.615 }, 00:17:26.615 { 00:17:26.615 "name": "BaseBdev4", 00:17:26.615 "uuid": "f83609a3-9e19-42e6-8922-21afc3dd01a2", 00:17:26.615 "is_configured": true, 00:17:26.615 "data_offset": 0, 00:17:26.615 "data_size": 65536 00:17:26.615 } 00:17:26.615 ] 00:17:26.615 }' 00:17:26.615 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.615 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:27.182 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:27.441 [2024-07-15 13:40:14.952850] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:27.441 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ 
)) 00:17:27.441 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:27.441 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.441 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:27.699 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:27.699 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:27.699 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:27.958 [2024-07-15 13:40:15.321404] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:27.958 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:28.265 [2024-07-15 13:40:15.659968] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:28.265 [2024-07-15 13:40:15.660033] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:28.265 [2024-07-15 13:40:15.670159] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:28.265 [2024-07-15 13:40:15.670186] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:28.265 [2024-07-15 13:40:15.670194] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2341840 name Existed_Raid, state offline 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:28.265 13:40:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:28.265 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:28.523 BaseBdev2 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:28.523 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.780 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:28.780 [ 00:17:28.780 { 00:17:28.780 "name": "BaseBdev2", 00:17:28.780 "aliases": [ 00:17:28.780 "a28b60af-e800-419d-a030-dc1230b0e9de" 00:17:28.780 ], 00:17:28.780 "product_name": "Malloc disk", 00:17:28.780 "block_size": 512, 00:17:28.780 "num_blocks": 65536, 00:17:28.780 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:28.780 "assigned_rate_limits": { 00:17:28.780 "rw_ios_per_sec": 0, 00:17:28.780 "rw_mbytes_per_sec": 0, 00:17:28.780 "r_mbytes_per_sec": 0, 00:17:28.780 "w_mbytes_per_sec": 0 00:17:28.780 }, 00:17:28.780 "claimed": false, 00:17:28.780 "zoned": false, 00:17:28.780 "supported_io_types": { 00:17:28.780 "read": true, 00:17:28.780 "write": true, 00:17:28.780 "unmap": true, 00:17:28.780 "flush": true, 00:17:28.780 "reset": true, 00:17:28.780 "nvme_admin": false, 00:17:28.780 "nvme_io": false, 00:17:28.780 "nvme_io_md": false, 00:17:28.780 "write_zeroes": true, 00:17:28.780 "zcopy": true, 00:17:28.780 "get_zone_info": false, 00:17:28.780 "zone_management": false, 00:17:28.780 "zone_append": false, 00:17:28.780 "compare": false, 00:17:28.780 "compare_and_write": false, 00:17:28.780 "abort": true, 00:17:28.780 "seek_hole": false, 00:17:28.780 "seek_data": false, 00:17:28.780 "copy": true, 00:17:28.780 "nvme_iov_md": false 00:17:28.780 }, 00:17:28.780 "memory_domains": [ 00:17:28.780 { 00:17:28.780 "dma_device_id": "system", 00:17:28.780 "dma_device_type": 1 00:17:28.780 }, 00:17:28.780 { 00:17:28.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.780 "dma_device_type": 2 00:17:28.780 } 00:17:28.780 ], 00:17:28.780 "driver_specific": {} 00:17:28.780 } 00:17:28.780 ] 00:17:28.780 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:28.780 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:28.780 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:28.780 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev3 00:17:29.037 BaseBdev3 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:29.037 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.295 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:29.295 [ 00:17:29.295 { 00:17:29.295 "name": "BaseBdev3", 00:17:29.295 "aliases": [ 00:17:29.295 "f813e779-b67f-4881-8686-de55d3af6806" 00:17:29.295 ], 00:17:29.295 "product_name": "Malloc disk", 00:17:29.295 "block_size": 512, 00:17:29.295 "num_blocks": 65536, 00:17:29.295 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:29.295 "assigned_rate_limits": { 00:17:29.295 "rw_ios_per_sec": 0, 00:17:29.295 "rw_mbytes_per_sec": 0, 00:17:29.295 "r_mbytes_per_sec": 0, 00:17:29.295 "w_mbytes_per_sec": 0 00:17:29.295 }, 00:17:29.295 "claimed": false, 00:17:29.295 "zoned": false, 00:17:29.295 "supported_io_types": { 00:17:29.295 "read": true, 00:17:29.295 "write": true, 00:17:29.295 "unmap": true, 00:17:29.295 "flush": true, 00:17:29.295 "reset": true, 00:17:29.295 "nvme_admin": false, 00:17:29.295 "nvme_io": false, 00:17:29.295 "nvme_io_md": false, 00:17:29.295 "write_zeroes": true, 00:17:29.295 "zcopy": true, 00:17:29.295 "get_zone_info": false, 00:17:29.295 "zone_management": false, 00:17:29.295 "zone_append": false, 00:17:29.295 "compare": false, 00:17:29.295 "compare_and_write": false, 00:17:29.295 "abort": true, 00:17:29.295 "seek_hole": false, 00:17:29.295 "seek_data": false, 00:17:29.295 "copy": true, 00:17:29.295 "nvme_iov_md": false 00:17:29.295 }, 00:17:29.295 "memory_domains": [ 00:17:29.295 { 00:17:29.295 "dma_device_id": "system", 00:17:29.295 "dma_device_type": 1 00:17:29.295 }, 00:17:29.295 { 00:17:29.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.295 "dma_device_type": 2 00:17:29.295 } 00:17:29.295 ], 00:17:29.295 "driver_specific": {} 00:17:29.295 } 00:17:29.295 ] 00:17:29.295 13:40:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:29.295 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:29.295 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:29.295 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:29.553 BaseBdev4 00:17:29.553 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:29.553 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:29.553 
13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:29.553 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:29.553 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:29.553 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:29.553 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.811 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:30.070 [ 00:17:30.070 { 00:17:30.070 "name": "BaseBdev4", 00:17:30.070 "aliases": [ 00:17:30.070 "fbef40a0-54da-454a-b2ce-08d2437702c6" 00:17:30.070 ], 00:17:30.070 "product_name": "Malloc disk", 00:17:30.070 "block_size": 512, 00:17:30.070 "num_blocks": 65536, 00:17:30.070 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:30.070 "assigned_rate_limits": { 00:17:30.070 "rw_ios_per_sec": 0, 00:17:30.070 "rw_mbytes_per_sec": 0, 00:17:30.070 "r_mbytes_per_sec": 0, 00:17:30.070 "w_mbytes_per_sec": 0 00:17:30.070 }, 00:17:30.070 "claimed": false, 00:17:30.070 "zoned": false, 00:17:30.070 "supported_io_types": { 00:17:30.070 "read": true, 00:17:30.070 "write": true, 00:17:30.070 "unmap": true, 00:17:30.070 "flush": true, 00:17:30.070 "reset": true, 00:17:30.070 "nvme_admin": false, 00:17:30.070 "nvme_io": false, 00:17:30.070 "nvme_io_md": false, 00:17:30.070 "write_zeroes": true, 00:17:30.070 "zcopy": true, 00:17:30.070 "get_zone_info": false, 00:17:30.070 "zone_management": false, 00:17:30.070 "zone_append": false, 00:17:30.070 "compare": false, 00:17:30.070 "compare_and_write": false, 00:17:30.070 "abort": true, 00:17:30.070 "seek_hole": false, 00:17:30.070 "seek_data": false, 00:17:30.070 "copy": true, 00:17:30.070 "nvme_iov_md": false 00:17:30.070 }, 00:17:30.070 "memory_domains": [ 00:17:30.070 { 00:17:30.070 "dma_device_id": "system", 00:17:30.070 "dma_device_type": 1 00:17:30.070 }, 00:17:30.070 { 00:17:30.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.070 "dma_device_type": 2 00:17:30.070 } 00:17:30.070 ], 00:17:30.070 "driver_specific": {} 00:17:30.070 } 00:17:30.070 ] 00:17:30.070 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:30.071 [2024-07-15 13:40:17.627985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:30.071 [2024-07-15 13:40:17.628037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:30.071 [2024-07-15 13:40:17.628051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:30.071 [2024-07-15 13:40:17.629075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is 
claimed 00:17:30.071 [2024-07-15 13:40:17.629105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.071 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.329 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.329 "name": "Existed_Raid", 00:17:30.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:30.329 "strip_size_kb": 0, 00:17:30.329 "state": "configuring", 00:17:30.329 "raid_level": "raid1", 00:17:30.329 "superblock": false, 00:17:30.329 "num_base_bdevs": 4, 00:17:30.329 "num_base_bdevs_discovered": 3, 00:17:30.329 "num_base_bdevs_operational": 4, 00:17:30.329 "base_bdevs_list": [ 00:17:30.329 { 00:17:30.329 "name": "BaseBdev1", 00:17:30.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:30.329 "is_configured": false, 00:17:30.329 "data_offset": 0, 00:17:30.329 "data_size": 0 00:17:30.329 }, 00:17:30.329 { 00:17:30.329 "name": "BaseBdev2", 00:17:30.329 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:30.329 "is_configured": true, 00:17:30.329 "data_offset": 0, 00:17:30.329 "data_size": 65536 00:17:30.329 }, 00:17:30.329 { 00:17:30.329 "name": "BaseBdev3", 00:17:30.329 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:30.329 "is_configured": true, 00:17:30.329 "data_offset": 0, 00:17:30.329 "data_size": 65536 00:17:30.329 }, 00:17:30.329 { 00:17:30.329 "name": "BaseBdev4", 00:17:30.329 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:30.329 "is_configured": true, 00:17:30.329 "data_offset": 0, 00:17:30.329 "data_size": 65536 00:17:30.329 } 00:17:30.329 ] 00:17:30.329 }' 00:17:30.329 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.329 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:30.894 [2024-07-15 13:40:18.474175] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.894 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.152 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.152 "name": "Existed_Raid", 00:17:31.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.152 "strip_size_kb": 0, 00:17:31.152 "state": "configuring", 00:17:31.152 "raid_level": "raid1", 00:17:31.152 "superblock": false, 00:17:31.152 "num_base_bdevs": 4, 00:17:31.152 "num_base_bdevs_discovered": 2, 00:17:31.152 "num_base_bdevs_operational": 4, 00:17:31.152 "base_bdevs_list": [ 00:17:31.152 { 00:17:31.152 "name": "BaseBdev1", 00:17:31.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.152 "is_configured": false, 00:17:31.152 "data_offset": 0, 00:17:31.152 "data_size": 0 00:17:31.152 }, 00:17:31.152 { 00:17:31.152 "name": null, 00:17:31.152 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:31.152 "is_configured": false, 00:17:31.152 "data_offset": 0, 00:17:31.152 "data_size": 65536 00:17:31.152 }, 00:17:31.152 { 00:17:31.152 "name": "BaseBdev3", 00:17:31.152 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:31.152 "is_configured": true, 00:17:31.152 "data_offset": 0, 00:17:31.152 "data_size": 65536 00:17:31.152 }, 00:17:31.152 { 00:17:31.152 "name": "BaseBdev4", 00:17:31.152 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:31.152 "is_configured": true, 00:17:31.152 "data_offset": 0, 00:17:31.152 "data_size": 65536 00:17:31.152 } 00:17:31.152 ] 00:17:31.152 }' 00:17:31.152 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.152 13:40:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.719 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:31.719 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.719 13:40:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:31.719 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:31.977 [2024-07-15 13:40:19.476763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:31.977 BaseBdev1 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:31.977 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:32.236 [ 00:17:32.236 { 00:17:32.236 "name": "BaseBdev1", 00:17:32.236 "aliases": [ 00:17:32.236 "e1844065-e9c9-459e-9ea2-06232c66b5ad" 00:17:32.236 ], 00:17:32.236 "product_name": "Malloc disk", 00:17:32.236 "block_size": 512, 00:17:32.236 "num_blocks": 65536, 00:17:32.236 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:32.236 "assigned_rate_limits": { 00:17:32.236 "rw_ios_per_sec": 0, 00:17:32.236 "rw_mbytes_per_sec": 0, 00:17:32.236 "r_mbytes_per_sec": 0, 00:17:32.236 "w_mbytes_per_sec": 0 00:17:32.236 }, 00:17:32.236 "claimed": true, 00:17:32.236 "claim_type": "exclusive_write", 00:17:32.236 "zoned": false, 00:17:32.236 "supported_io_types": { 00:17:32.236 "read": true, 00:17:32.236 "write": true, 00:17:32.236 "unmap": true, 00:17:32.236 "flush": true, 00:17:32.236 "reset": true, 00:17:32.236 "nvme_admin": false, 00:17:32.236 "nvme_io": false, 00:17:32.236 "nvme_io_md": false, 00:17:32.236 "write_zeroes": true, 00:17:32.236 "zcopy": true, 00:17:32.236 "get_zone_info": false, 00:17:32.236 "zone_management": false, 00:17:32.236 "zone_append": false, 00:17:32.236 "compare": false, 00:17:32.236 "compare_and_write": false, 00:17:32.236 "abort": true, 00:17:32.236 "seek_hole": false, 00:17:32.236 "seek_data": false, 00:17:32.236 "copy": true, 00:17:32.236 "nvme_iov_md": false 00:17:32.236 }, 00:17:32.236 "memory_domains": [ 00:17:32.236 { 00:17:32.236 "dma_device_id": "system", 00:17:32.236 "dma_device_type": 1 00:17:32.236 }, 00:17:32.236 { 00:17:32.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.236 "dma_device_type": 2 00:17:32.236 } 00:17:32.236 ], 00:17:32.236 "driver_specific": {} 00:17:32.236 } 00:17:32.236 ] 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.236 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.495 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.495 "name": "Existed_Raid", 00:17:32.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.495 "strip_size_kb": 0, 00:17:32.495 "state": "configuring", 00:17:32.495 "raid_level": "raid1", 00:17:32.495 "superblock": false, 00:17:32.495 "num_base_bdevs": 4, 00:17:32.495 "num_base_bdevs_discovered": 3, 00:17:32.495 "num_base_bdevs_operational": 4, 00:17:32.495 "base_bdevs_list": [ 00:17:32.495 { 00:17:32.495 "name": "BaseBdev1", 00:17:32.495 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:32.495 "is_configured": true, 00:17:32.495 "data_offset": 0, 00:17:32.495 "data_size": 65536 00:17:32.495 }, 00:17:32.495 { 00:17:32.495 "name": null, 00:17:32.495 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:32.495 "is_configured": false, 00:17:32.495 "data_offset": 0, 00:17:32.495 "data_size": 65536 00:17:32.495 }, 00:17:32.495 { 00:17:32.495 "name": "BaseBdev3", 00:17:32.495 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:32.495 "is_configured": true, 00:17:32.495 "data_offset": 0, 00:17:32.495 "data_size": 65536 00:17:32.495 }, 00:17:32.495 { 00:17:32.495 "name": "BaseBdev4", 00:17:32.495 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:32.495 "is_configured": true, 00:17:32.495 "data_offset": 0, 00:17:32.495 "data_size": 65536 00:17:32.495 } 00:17:32.495 ] 00:17:32.495 }' 00:17:32.495 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.495 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.062 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.062 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:33.062 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:33.062 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:33.355 
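At this point Existed_Raid was created with -r raid1 over 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' while BaseBdev1 did not yet exist, so the array stays in the configuring state and the test exercises bdev_raid_remove_base_bdev / bdev_raid_add_base_bdev against it, re-reading is_configured after each RPC. A condensed sketch of that remove/re-add cycle, under the same assumptions as the sketch above (names and jq paths taken from the log):

    # Drop BaseBdev3 from the configuring array, confirm slot 2 is unconfigured, then add it back.
    "$rpc" -s "$sock" bdev_raid_remove_base_bdev BaseBdev3
    [[ $("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured') == false ]]
    "$rpc" -s "$sock" bdev_raid_add_base_bdev Existed_Raid BaseBdev3
    [[ $("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured') == true ]]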
[2024-07-15 13:40:20.836285] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:33.355 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:33.355 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.356 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.614 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.614 "name": "Existed_Raid", 00:17:33.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.614 "strip_size_kb": 0, 00:17:33.614 "state": "configuring", 00:17:33.614 "raid_level": "raid1", 00:17:33.614 "superblock": false, 00:17:33.614 "num_base_bdevs": 4, 00:17:33.614 "num_base_bdevs_discovered": 2, 00:17:33.614 "num_base_bdevs_operational": 4, 00:17:33.614 "base_bdevs_list": [ 00:17:33.614 { 00:17:33.614 "name": "BaseBdev1", 00:17:33.614 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:33.614 "is_configured": true, 00:17:33.614 "data_offset": 0, 00:17:33.614 "data_size": 65536 00:17:33.614 }, 00:17:33.614 { 00:17:33.614 "name": null, 00:17:33.614 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:33.614 "is_configured": false, 00:17:33.614 "data_offset": 0, 00:17:33.614 "data_size": 65536 00:17:33.614 }, 00:17:33.614 { 00:17:33.614 "name": null, 00:17:33.615 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:33.615 "is_configured": false, 00:17:33.615 "data_offset": 0, 00:17:33.615 "data_size": 65536 00:17:33.615 }, 00:17:33.615 { 00:17:33.615 "name": "BaseBdev4", 00:17:33.615 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:33.615 "is_configured": true, 00:17:33.615 "data_offset": 0, 00:17:33.615 "data_size": 65536 00:17:33.615 } 00:17:33.615 ] 00:17:33.615 }' 00:17:33.615 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.615 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.182 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.182 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:34.182 
13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:34.182 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:34.440 [2024-07-15 13:40:21.830860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:34.440 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:34.440 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:34.440 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:34.440 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.440 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.440 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:34.441 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.441 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.441 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.441 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.441 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.441 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.441 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.441 "name": "Existed_Raid", 00:17:34.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.441 "strip_size_kb": 0, 00:17:34.441 "state": "configuring", 00:17:34.441 "raid_level": "raid1", 00:17:34.441 "superblock": false, 00:17:34.441 "num_base_bdevs": 4, 00:17:34.441 "num_base_bdevs_discovered": 3, 00:17:34.441 "num_base_bdevs_operational": 4, 00:17:34.441 "base_bdevs_list": [ 00:17:34.441 { 00:17:34.441 "name": "BaseBdev1", 00:17:34.441 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:34.441 "is_configured": true, 00:17:34.441 "data_offset": 0, 00:17:34.441 "data_size": 65536 00:17:34.441 }, 00:17:34.441 { 00:17:34.441 "name": null, 00:17:34.441 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:34.441 "is_configured": false, 00:17:34.441 "data_offset": 0, 00:17:34.441 "data_size": 65536 00:17:34.441 }, 00:17:34.441 { 00:17:34.441 "name": "BaseBdev3", 00:17:34.441 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:34.441 "is_configured": true, 00:17:34.441 "data_offset": 0, 00:17:34.441 "data_size": 65536 00:17:34.441 }, 00:17:34.441 { 00:17:34.441 "name": "BaseBdev4", 00:17:34.441 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:34.441 "is_configured": true, 00:17:34.441 "data_offset": 0, 00:17:34.441 "data_size": 65536 00:17:34.441 } 00:17:34.441 ] 00:17:34.441 }' 00:17:34.441 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.441 13:40:22 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:35.008 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.008 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:35.266 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:35.266 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:35.266 [2024-07-15 13:40:22.873569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.525 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.525 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.525 "name": "Existed_Raid", 00:17:35.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.525 "strip_size_kb": 0, 00:17:35.525 "state": "configuring", 00:17:35.525 "raid_level": "raid1", 00:17:35.525 "superblock": false, 00:17:35.525 "num_base_bdevs": 4, 00:17:35.525 "num_base_bdevs_discovered": 2, 00:17:35.525 "num_base_bdevs_operational": 4, 00:17:35.525 "base_bdevs_list": [ 00:17:35.525 { 00:17:35.525 "name": null, 00:17:35.525 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:35.525 "is_configured": false, 00:17:35.525 "data_offset": 0, 00:17:35.525 "data_size": 65536 00:17:35.525 }, 00:17:35.525 { 00:17:35.525 "name": null, 00:17:35.525 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:35.525 "is_configured": false, 00:17:35.525 "data_offset": 0, 00:17:35.525 "data_size": 65536 00:17:35.525 }, 00:17:35.525 { 00:17:35.525 "name": "BaseBdev3", 00:17:35.525 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:35.525 "is_configured": true, 00:17:35.525 "data_offset": 0, 00:17:35.525 "data_size": 65536 00:17:35.525 }, 00:17:35.525 { 00:17:35.525 "name": "BaseBdev4", 00:17:35.525 "uuid": 
"fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:35.525 "is_configured": true, 00:17:35.525 "data_offset": 0, 00:17:35.525 "data_size": 65536 00:17:35.525 } 00:17:35.525 ] 00:17:35.525 }' 00:17:35.525 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.525 13:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.091 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.091 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:36.350 [2024-07-15 13:40:23.879930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.350 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.610 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.610 "name": "Existed_Raid", 00:17:36.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.610 "strip_size_kb": 0, 00:17:36.610 "state": "configuring", 00:17:36.610 "raid_level": "raid1", 00:17:36.610 "superblock": false, 00:17:36.610 "num_base_bdevs": 4, 00:17:36.610 "num_base_bdevs_discovered": 3, 00:17:36.610 "num_base_bdevs_operational": 4, 00:17:36.610 "base_bdevs_list": [ 00:17:36.610 { 00:17:36.610 "name": null, 00:17:36.610 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:36.610 "is_configured": false, 00:17:36.610 "data_offset": 0, 00:17:36.610 "data_size": 65536 00:17:36.610 }, 00:17:36.610 { 00:17:36.610 "name": "BaseBdev2", 00:17:36.610 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:36.610 "is_configured": true, 00:17:36.610 
"data_offset": 0, 00:17:36.610 "data_size": 65536 00:17:36.610 }, 00:17:36.610 { 00:17:36.610 "name": "BaseBdev3", 00:17:36.610 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:36.610 "is_configured": true, 00:17:36.610 "data_offset": 0, 00:17:36.610 "data_size": 65536 00:17:36.610 }, 00:17:36.610 { 00:17:36.610 "name": "BaseBdev4", 00:17:36.610 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:36.610 "is_configured": true, 00:17:36.610 "data_offset": 0, 00:17:36.610 "data_size": 65536 00:17:36.610 } 00:17:36.610 ] 00:17:36.610 }' 00:17:36.610 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.610 13:40:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.178 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.178 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:37.178 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:37.178 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.178 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:37.437 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e1844065-e9c9-459e-9ea2-06232c66b5ad 00:17:37.437 [2024-07-15 13:40:25.042199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:37.437 [2024-07-15 13:40:25.042231] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2348a20 00:17:37.437 [2024-07-15 13:40:25.042236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:37.437 [2024-07-15 13:40:25.042366] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2347e60 00:17:37.437 [2024-07-15 13:40:25.042453] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2348a20 00:17:37.437 [2024-07-15 13:40:25.042459] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2348a20 00:17:37.437 [2024-07-15 13:40:25.042593] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:37.437 NewBaseBdev 00:17:37.437 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:37.437 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:37.437 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:37.696 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:37.696 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:37.696 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:37.696 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:17:37.696 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:37.955 [ 00:17:37.955 { 00:17:37.955 "name": "NewBaseBdev", 00:17:37.955 "aliases": [ 00:17:37.955 "e1844065-e9c9-459e-9ea2-06232c66b5ad" 00:17:37.955 ], 00:17:37.955 "product_name": "Malloc disk", 00:17:37.955 "block_size": 512, 00:17:37.955 "num_blocks": 65536, 00:17:37.955 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:37.955 "assigned_rate_limits": { 00:17:37.955 "rw_ios_per_sec": 0, 00:17:37.955 "rw_mbytes_per_sec": 0, 00:17:37.955 "r_mbytes_per_sec": 0, 00:17:37.955 "w_mbytes_per_sec": 0 00:17:37.955 }, 00:17:37.955 "claimed": true, 00:17:37.955 "claim_type": "exclusive_write", 00:17:37.955 "zoned": false, 00:17:37.955 "supported_io_types": { 00:17:37.955 "read": true, 00:17:37.955 "write": true, 00:17:37.955 "unmap": true, 00:17:37.955 "flush": true, 00:17:37.955 "reset": true, 00:17:37.955 "nvme_admin": false, 00:17:37.955 "nvme_io": false, 00:17:37.955 "nvme_io_md": false, 00:17:37.955 "write_zeroes": true, 00:17:37.955 "zcopy": true, 00:17:37.955 "get_zone_info": false, 00:17:37.955 "zone_management": false, 00:17:37.955 "zone_append": false, 00:17:37.955 "compare": false, 00:17:37.955 "compare_and_write": false, 00:17:37.955 "abort": true, 00:17:37.955 "seek_hole": false, 00:17:37.955 "seek_data": false, 00:17:37.955 "copy": true, 00:17:37.955 "nvme_iov_md": false 00:17:37.955 }, 00:17:37.955 "memory_domains": [ 00:17:37.955 { 00:17:37.955 "dma_device_id": "system", 00:17:37.955 "dma_device_type": 1 00:17:37.955 }, 00:17:37.955 { 00:17:37.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.955 "dma_device_type": 2 00:17:37.955 } 00:17:37.955 ], 00:17:37.955 "driver_specific": {} 00:17:37.955 } 00:17:37.955 ] 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.955 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.956 "name": "Existed_Raid", 00:17:37.956 "uuid": "ef27f26c-4092-4694-9071-9599975805ac", 00:17:37.956 "strip_size_kb": 0, 00:17:37.956 "state": "online", 00:17:37.956 "raid_level": "raid1", 00:17:37.956 "superblock": false, 00:17:37.956 "num_base_bdevs": 4, 00:17:37.956 "num_base_bdevs_discovered": 4, 00:17:37.956 "num_base_bdevs_operational": 4, 00:17:37.956 "base_bdevs_list": [ 00:17:37.956 { 00:17:37.956 "name": "NewBaseBdev", 00:17:37.956 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:37.956 "is_configured": true, 00:17:37.956 "data_offset": 0, 00:17:37.956 "data_size": 65536 00:17:37.956 }, 00:17:37.956 { 00:17:37.956 "name": "BaseBdev2", 00:17:37.956 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:37.956 "is_configured": true, 00:17:37.956 "data_offset": 0, 00:17:37.956 "data_size": 65536 00:17:37.956 }, 00:17:37.956 { 00:17:37.956 "name": "BaseBdev3", 00:17:37.956 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:37.956 "is_configured": true, 00:17:37.956 "data_offset": 0, 00:17:37.956 "data_size": 65536 00:17:37.956 }, 00:17:37.956 { 00:17:37.956 "name": "BaseBdev4", 00:17:37.956 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:37.956 "is_configured": true, 00:17:37.956 "data_offset": 0, 00:17:37.956 "data_size": 65536 00:17:37.956 } 00:17:37.956 ] 00:17:37.956 }' 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.956 13:40:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:38.524 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:38.782 [2024-07-15 13:40:26.201472] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:38.782 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:38.782 "name": "Existed_Raid", 00:17:38.782 "aliases": [ 00:17:38.782 "ef27f26c-4092-4694-9071-9599975805ac" 00:17:38.782 ], 00:17:38.782 "product_name": "Raid Volume", 00:17:38.782 "block_size": 512, 00:17:38.782 "num_blocks": 65536, 00:17:38.782 "uuid": "ef27f26c-4092-4694-9071-9599975805ac", 00:17:38.782 "assigned_rate_limits": { 00:17:38.782 "rw_ios_per_sec": 0, 00:17:38.782 "rw_mbytes_per_sec": 0, 00:17:38.782 "r_mbytes_per_sec": 0, 00:17:38.782 "w_mbytes_per_sec": 0 00:17:38.782 }, 00:17:38.782 "claimed": false, 00:17:38.782 "zoned": false, 00:17:38.782 "supported_io_types": { 00:17:38.782 "read": true, 00:17:38.782 "write": true, 00:17:38.782 "unmap": false, 00:17:38.782 "flush": false, 00:17:38.782 "reset": true, 00:17:38.782 "nvme_admin": false, 
00:17:38.782 "nvme_io": false, 00:17:38.782 "nvme_io_md": false, 00:17:38.782 "write_zeroes": true, 00:17:38.782 "zcopy": false, 00:17:38.782 "get_zone_info": false, 00:17:38.782 "zone_management": false, 00:17:38.782 "zone_append": false, 00:17:38.782 "compare": false, 00:17:38.782 "compare_and_write": false, 00:17:38.782 "abort": false, 00:17:38.782 "seek_hole": false, 00:17:38.782 "seek_data": false, 00:17:38.782 "copy": false, 00:17:38.782 "nvme_iov_md": false 00:17:38.782 }, 00:17:38.782 "memory_domains": [ 00:17:38.782 { 00:17:38.782 "dma_device_id": "system", 00:17:38.782 "dma_device_type": 1 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.782 "dma_device_type": 2 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "system", 00:17:38.782 "dma_device_type": 1 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.782 "dma_device_type": 2 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "system", 00:17:38.782 "dma_device_type": 1 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.782 "dma_device_type": 2 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "system", 00:17:38.782 "dma_device_type": 1 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.782 "dma_device_type": 2 00:17:38.782 } 00:17:38.782 ], 00:17:38.782 "driver_specific": { 00:17:38.782 "raid": { 00:17:38.782 "uuid": "ef27f26c-4092-4694-9071-9599975805ac", 00:17:38.782 "strip_size_kb": 0, 00:17:38.782 "state": "online", 00:17:38.782 "raid_level": "raid1", 00:17:38.782 "superblock": false, 00:17:38.782 "num_base_bdevs": 4, 00:17:38.782 "num_base_bdevs_discovered": 4, 00:17:38.782 "num_base_bdevs_operational": 4, 00:17:38.782 "base_bdevs_list": [ 00:17:38.782 { 00:17:38.782 "name": "NewBaseBdev", 00:17:38.782 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:38.782 "is_configured": true, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "name": "BaseBdev2", 00:17:38.782 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:38.782 "is_configured": true, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "name": "BaseBdev3", 00:17:38.782 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:38.782 "is_configured": true, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 }, 00:17:38.782 { 00:17:38.782 "name": "BaseBdev4", 00:17:38.782 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:38.782 "is_configured": true, 00:17:38.782 "data_offset": 0, 00:17:38.782 "data_size": 65536 00:17:38.782 } 00:17:38.782 ] 00:17:38.782 } 00:17:38.782 } 00:17:38.782 }' 00:17:38.782 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:38.782 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:38.782 BaseBdev2 00:17:38.782 BaseBdev3 00:17:38.782 BaseBdev4' 00:17:38.782 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:38.783 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:38.783 13:40:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:39.041 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:39.041 "name": "NewBaseBdev", 00:17:39.041 "aliases": [ 00:17:39.041 "e1844065-e9c9-459e-9ea2-06232c66b5ad" 00:17:39.041 ], 00:17:39.041 "product_name": "Malloc disk", 00:17:39.041 "block_size": 512, 00:17:39.041 "num_blocks": 65536, 00:17:39.041 "uuid": "e1844065-e9c9-459e-9ea2-06232c66b5ad", 00:17:39.041 "assigned_rate_limits": { 00:17:39.041 "rw_ios_per_sec": 0, 00:17:39.041 "rw_mbytes_per_sec": 0, 00:17:39.041 "r_mbytes_per_sec": 0, 00:17:39.041 "w_mbytes_per_sec": 0 00:17:39.041 }, 00:17:39.041 "claimed": true, 00:17:39.041 "claim_type": "exclusive_write", 00:17:39.041 "zoned": false, 00:17:39.041 "supported_io_types": { 00:17:39.041 "read": true, 00:17:39.041 "write": true, 00:17:39.041 "unmap": true, 00:17:39.041 "flush": true, 00:17:39.041 "reset": true, 00:17:39.041 "nvme_admin": false, 00:17:39.041 "nvme_io": false, 00:17:39.041 "nvme_io_md": false, 00:17:39.041 "write_zeroes": true, 00:17:39.041 "zcopy": true, 00:17:39.041 "get_zone_info": false, 00:17:39.041 "zone_management": false, 00:17:39.041 "zone_append": false, 00:17:39.041 "compare": false, 00:17:39.041 "compare_and_write": false, 00:17:39.041 "abort": true, 00:17:39.041 "seek_hole": false, 00:17:39.041 "seek_data": false, 00:17:39.041 "copy": true, 00:17:39.041 "nvme_iov_md": false 00:17:39.041 }, 00:17:39.041 "memory_domains": [ 00:17:39.041 { 00:17:39.041 "dma_device_id": "system", 00:17:39.042 "dma_device_type": 1 00:17:39.042 }, 00:17:39.042 { 00:17:39.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.042 "dma_device_type": 2 00:17:39.042 } 00:17:39.042 ], 00:17:39.042 "driver_specific": {} 00:17:39.042 }' 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.042 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:39.301 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:39.560 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:39.560 "name": "BaseBdev2", 
00:17:39.560 "aliases": [ 00:17:39.560 "a28b60af-e800-419d-a030-dc1230b0e9de" 00:17:39.560 ], 00:17:39.560 "product_name": "Malloc disk", 00:17:39.560 "block_size": 512, 00:17:39.560 "num_blocks": 65536, 00:17:39.560 "uuid": "a28b60af-e800-419d-a030-dc1230b0e9de", 00:17:39.560 "assigned_rate_limits": { 00:17:39.560 "rw_ios_per_sec": 0, 00:17:39.560 "rw_mbytes_per_sec": 0, 00:17:39.560 "r_mbytes_per_sec": 0, 00:17:39.560 "w_mbytes_per_sec": 0 00:17:39.560 }, 00:17:39.560 "claimed": true, 00:17:39.560 "claim_type": "exclusive_write", 00:17:39.560 "zoned": false, 00:17:39.560 "supported_io_types": { 00:17:39.560 "read": true, 00:17:39.560 "write": true, 00:17:39.560 "unmap": true, 00:17:39.560 "flush": true, 00:17:39.560 "reset": true, 00:17:39.560 "nvme_admin": false, 00:17:39.560 "nvme_io": false, 00:17:39.560 "nvme_io_md": false, 00:17:39.560 "write_zeroes": true, 00:17:39.560 "zcopy": true, 00:17:39.560 "get_zone_info": false, 00:17:39.560 "zone_management": false, 00:17:39.560 "zone_append": false, 00:17:39.560 "compare": false, 00:17:39.560 "compare_and_write": false, 00:17:39.560 "abort": true, 00:17:39.560 "seek_hole": false, 00:17:39.560 "seek_data": false, 00:17:39.560 "copy": true, 00:17:39.560 "nvme_iov_md": false 00:17:39.560 }, 00:17:39.560 "memory_domains": [ 00:17:39.560 { 00:17:39.560 "dma_device_id": "system", 00:17:39.560 "dma_device_type": 1 00:17:39.560 }, 00:17:39.560 { 00:17:39.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.560 "dma_device_type": 2 00:17:39.560 } 00:17:39.560 ], 00:17:39.560 "driver_specific": {} 00:17:39.560 }' 00:17:39.560 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.560 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.560 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:39.820 "name": "BaseBdev3", 00:17:39.820 "aliases": [ 00:17:39.820 "f813e779-b67f-4881-8686-de55d3af6806" 00:17:39.820 ], 00:17:39.820 "product_name": "Malloc disk", 00:17:39.820 "block_size": 512, 
00:17:39.820 "num_blocks": 65536, 00:17:39.820 "uuid": "f813e779-b67f-4881-8686-de55d3af6806", 00:17:39.820 "assigned_rate_limits": { 00:17:39.820 "rw_ios_per_sec": 0, 00:17:39.820 "rw_mbytes_per_sec": 0, 00:17:39.820 "r_mbytes_per_sec": 0, 00:17:39.820 "w_mbytes_per_sec": 0 00:17:39.820 }, 00:17:39.820 "claimed": true, 00:17:39.820 "claim_type": "exclusive_write", 00:17:39.820 "zoned": false, 00:17:39.820 "supported_io_types": { 00:17:39.820 "read": true, 00:17:39.820 "write": true, 00:17:39.820 "unmap": true, 00:17:39.820 "flush": true, 00:17:39.820 "reset": true, 00:17:39.820 "nvme_admin": false, 00:17:39.820 "nvme_io": false, 00:17:39.820 "nvme_io_md": false, 00:17:39.820 "write_zeroes": true, 00:17:39.820 "zcopy": true, 00:17:39.820 "get_zone_info": false, 00:17:39.820 "zone_management": false, 00:17:39.820 "zone_append": false, 00:17:39.820 "compare": false, 00:17:39.820 "compare_and_write": false, 00:17:39.820 "abort": true, 00:17:39.820 "seek_hole": false, 00:17:39.820 "seek_data": false, 00:17:39.820 "copy": true, 00:17:39.820 "nvme_iov_md": false 00:17:39.820 }, 00:17:39.820 "memory_domains": [ 00:17:39.820 { 00:17:39.820 "dma_device_id": "system", 00:17:39.820 "dma_device_type": 1 00:17:39.820 }, 00:17:39.820 { 00:17:39.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.820 "dma_device_type": 2 00:17:39.820 } 00:17:39.820 ], 00:17:39.820 "driver_specific": {} 00:17:39.820 }' 00:17:39.820 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.078 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:40.337 "name": "BaseBdev4", 00:17:40.337 "aliases": [ 00:17:40.337 "fbef40a0-54da-454a-b2ce-08d2437702c6" 00:17:40.337 ], 00:17:40.337 "product_name": "Malloc disk", 00:17:40.337 "block_size": 512, 00:17:40.337 "num_blocks": 65536, 00:17:40.337 "uuid": "fbef40a0-54da-454a-b2ce-08d2437702c6", 00:17:40.337 "assigned_rate_limits": { 00:17:40.337 "rw_ios_per_sec": 0, 00:17:40.337 
"rw_mbytes_per_sec": 0, 00:17:40.337 "r_mbytes_per_sec": 0, 00:17:40.337 "w_mbytes_per_sec": 0 00:17:40.337 }, 00:17:40.337 "claimed": true, 00:17:40.337 "claim_type": "exclusive_write", 00:17:40.337 "zoned": false, 00:17:40.337 "supported_io_types": { 00:17:40.337 "read": true, 00:17:40.337 "write": true, 00:17:40.337 "unmap": true, 00:17:40.337 "flush": true, 00:17:40.337 "reset": true, 00:17:40.337 "nvme_admin": false, 00:17:40.337 "nvme_io": false, 00:17:40.337 "nvme_io_md": false, 00:17:40.337 "write_zeroes": true, 00:17:40.337 "zcopy": true, 00:17:40.337 "get_zone_info": false, 00:17:40.337 "zone_management": false, 00:17:40.337 "zone_append": false, 00:17:40.337 "compare": false, 00:17:40.337 "compare_and_write": false, 00:17:40.337 "abort": true, 00:17:40.337 "seek_hole": false, 00:17:40.337 "seek_data": false, 00:17:40.337 "copy": true, 00:17:40.337 "nvme_iov_md": false 00:17:40.337 }, 00:17:40.337 "memory_domains": [ 00:17:40.337 { 00:17:40.337 "dma_device_id": "system", 00:17:40.337 "dma_device_type": 1 00:17:40.337 }, 00:17:40.337 { 00:17:40.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.337 "dma_device_type": 2 00:17:40.337 } 00:17:40.337 ], 00:17:40.337 "driver_specific": {} 00:17:40.337 }' 00:17:40.337 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.597 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.597 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.597 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.856 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.856 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:40.856 [2024-07-15 13:40:28.374835] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:40.856 [2024-07-15 13:40:28.374859] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:40.857 [2024-07-15 13:40:28.374899] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:40.857 [2024-07-15 13:40:28.375116] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:40.857 [2024-07-15 13:40:28.375125] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2348a20 name Existed_Raid, state offline 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 45156 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 
45156 ']' 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 45156 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 45156 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 45156' 00:17:40.857 killing process with pid 45156 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 45156 00:17:40.857 [2024-07-15 13:40:28.440369] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:40.857 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 45156 00:17:41.115 [2024-07-15 13:40:28.475693] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:41.115 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:41.115 00:17:41.115 real 0m24.912s 00:17:41.115 user 0m45.360s 00:17:41.115 sys 0m4.889s 00:17:41.115 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:41.115 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.115 ************************************ 00:17:41.115 END TEST raid_state_function_test 00:17:41.115 ************************************ 00:17:41.115 13:40:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:41.115 13:40:28 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:17:41.115 13:40:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:41.115 13:40:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:41.115 13:40:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:41.374 ************************************ 00:17:41.374 START TEST raid_state_function_test_sb 00:17:41.374 ************************************ 00:17:41.374 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:17:41.374 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:41.374 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=49077 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 49077' 00:17:41.375 Process raid pid: 49077 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 49077 /var/tmp/spdk-raid.sock 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 49077 ']' 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:41.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:41.375 13:40:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:41.375 [2024-07-15 13:40:28.839056] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:17:41.375 [2024-07-15 13:40:28.839115] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:41.375 [2024-07-15 13:40:28.926848] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:41.634 [2024-07-15 13:40:29.015884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:41.634 [2024-07-15 13:40:29.072681] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:41.634 [2024-07-15 13:40:29.072713] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:42.203 [2024-07-15 13:40:29.788298] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:42.203 [2024-07-15 13:40:29.788334] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:42.203 [2024-07-15 13:40:29.788341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:42.203 [2024-07-15 13:40:29.788349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:42.203 [2024-07-15 13:40:29.788354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:42.203 [2024-07-15 13:40:29.788361] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:42.203 [2024-07-15 13:40:29.788366] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:42.203 [2024-07-15 13:40:29.788373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.203 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.462 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.462 "name": "Existed_Raid", 00:17:42.462 "uuid": "e637abec-822a-4d8c-8c0a-72e3910cc15e", 00:17:42.462 "strip_size_kb": 0, 00:17:42.462 "state": "configuring", 00:17:42.462 "raid_level": "raid1", 00:17:42.462 "superblock": true, 00:17:42.462 "num_base_bdevs": 4, 00:17:42.462 "num_base_bdevs_discovered": 0, 00:17:42.462 "num_base_bdevs_operational": 4, 00:17:42.462 "base_bdevs_list": [ 00:17:42.462 { 00:17:42.462 "name": "BaseBdev1", 00:17:42.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.462 "is_configured": false, 00:17:42.462 "data_offset": 0, 00:17:42.462 "data_size": 0 00:17:42.462 }, 00:17:42.462 { 00:17:42.462 "name": "BaseBdev2", 00:17:42.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.462 "is_configured": false, 00:17:42.462 "data_offset": 0, 00:17:42.462 "data_size": 0 00:17:42.462 }, 00:17:42.462 { 00:17:42.462 "name": "BaseBdev3", 00:17:42.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.462 "is_configured": false, 00:17:42.462 "data_offset": 0, 00:17:42.462 "data_size": 0 00:17:42.462 }, 00:17:42.462 { 00:17:42.462 "name": "BaseBdev4", 00:17:42.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.462 "is_configured": false, 00:17:42.462 "data_offset": 0, 00:17:42.462 "data_size": 0 00:17:42.462 } 00:17:42.462 ] 00:17:42.462 }' 00:17:42.462 13:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.462 13:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:43.029 13:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:43.286 [2024-07-15 13:40:30.658464] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:43.286 [2024-07-15 13:40:30.658490] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d2f70 name Existed_Raid, state configuring 00:17:43.286 13:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:43.286 [2024-07-15 13:40:30.834935] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:43.286 [2024-07-15 13:40:30.834962] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:43.286 [2024-07-15 13:40:30.834968] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:43.286 [2024-07-15 13:40:30.834975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:43.286 [2024-07-15 13:40:30.834981] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:43.287 [2024-07-15 13:40:30.834987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:43.287 [2024-07-15 13:40:30.834993] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:43.287 [2024-07-15 13:40:30.835004] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:43.287 13:40:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:43.544 [2024-07-15 13:40:31.008117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:43.544 BaseBdev1 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.544 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:43.802 [ 00:17:43.802 { 00:17:43.802 "name": "BaseBdev1", 00:17:43.802 "aliases": [ 00:17:43.802 "d64a084d-7af3-4768-9053-01884810e5b5" 00:17:43.802 ], 00:17:43.802 "product_name": "Malloc disk", 00:17:43.802 "block_size": 512, 00:17:43.802 "num_blocks": 65536, 00:17:43.802 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:43.802 "assigned_rate_limits": { 00:17:43.802 "rw_ios_per_sec": 0, 00:17:43.802 "rw_mbytes_per_sec": 0, 00:17:43.802 "r_mbytes_per_sec": 0, 00:17:43.802 "w_mbytes_per_sec": 0 00:17:43.802 }, 00:17:43.802 "claimed": true, 00:17:43.802 "claim_type": "exclusive_write", 00:17:43.802 "zoned": false, 00:17:43.802 "supported_io_types": { 00:17:43.802 "read": true, 00:17:43.802 "write": true, 00:17:43.802 "unmap": true, 00:17:43.802 "flush": true, 00:17:43.802 "reset": true, 00:17:43.802 "nvme_admin": false, 00:17:43.802 "nvme_io": false, 00:17:43.802 "nvme_io_md": false, 00:17:43.802 "write_zeroes": true, 00:17:43.802 "zcopy": true, 00:17:43.802 "get_zone_info": false, 00:17:43.802 "zone_management": false, 00:17:43.802 "zone_append": false, 00:17:43.802 "compare": false, 00:17:43.802 "compare_and_write": false, 00:17:43.802 "abort": true, 00:17:43.802 "seek_hole": false, 00:17:43.802 "seek_data": false, 00:17:43.802 "copy": true, 00:17:43.802 "nvme_iov_md": false 00:17:43.802 }, 00:17:43.802 "memory_domains": [ 00:17:43.802 { 00:17:43.802 "dma_device_id": "system", 00:17:43.802 "dma_device_type": 1 00:17:43.802 }, 00:17:43.802 { 00:17:43.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.802 "dma_device_type": 2 00:17:43.802 } 00:17:43.802 ], 00:17:43.802 
"driver_specific": {} 00:17:43.802 } 00:17:43.802 ] 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.802 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.060 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.060 "name": "Existed_Raid", 00:17:44.060 "uuid": "4403787f-c658-407a-9c21-b28e2171f88a", 00:17:44.060 "strip_size_kb": 0, 00:17:44.060 "state": "configuring", 00:17:44.060 "raid_level": "raid1", 00:17:44.060 "superblock": true, 00:17:44.060 "num_base_bdevs": 4, 00:17:44.060 "num_base_bdevs_discovered": 1, 00:17:44.060 "num_base_bdevs_operational": 4, 00:17:44.060 "base_bdevs_list": [ 00:17:44.060 { 00:17:44.060 "name": "BaseBdev1", 00:17:44.060 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:44.060 "is_configured": true, 00:17:44.060 "data_offset": 2048, 00:17:44.060 "data_size": 63488 00:17:44.060 }, 00:17:44.060 { 00:17:44.060 "name": "BaseBdev2", 00:17:44.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.060 "is_configured": false, 00:17:44.060 "data_offset": 0, 00:17:44.060 "data_size": 0 00:17:44.060 }, 00:17:44.060 { 00:17:44.060 "name": "BaseBdev3", 00:17:44.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.060 "is_configured": false, 00:17:44.060 "data_offset": 0, 00:17:44.060 "data_size": 0 00:17:44.060 }, 00:17:44.060 { 00:17:44.060 "name": "BaseBdev4", 00:17:44.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.060 "is_configured": false, 00:17:44.061 "data_offset": 0, 00:17:44.061 "data_size": 0 00:17:44.061 } 00:17:44.061 ] 00:17:44.061 }' 00:17:44.061 13:40:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.061 13:40:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.625 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:44.625 
[2024-07-15 13:40:32.171165] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:44.625 [2024-07-15 13:40:32.171201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d27e0 name Existed_Raid, state configuring 00:17:44.625 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:44.883 [2024-07-15 13:40:32.347661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:44.883 [2024-07-15 13:40:32.348756] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:44.883 [2024-07-15 13:40:32.348784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:44.883 [2024-07-15 13:40:32.348791] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:44.883 [2024-07-15 13:40:32.348799] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:44.883 [2024-07-15 13:40:32.348805] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:44.883 [2024-07-15 13:40:32.348812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.883 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.142 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.142 "name": "Existed_Raid", 00:17:45.142 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:45.142 "strip_size_kb": 0, 00:17:45.142 "state": "configuring", 00:17:45.142 "raid_level": "raid1", 00:17:45.142 "superblock": true, 00:17:45.142 
"num_base_bdevs": 4, 00:17:45.142 "num_base_bdevs_discovered": 1, 00:17:45.142 "num_base_bdevs_operational": 4, 00:17:45.142 "base_bdevs_list": [ 00:17:45.142 { 00:17:45.142 "name": "BaseBdev1", 00:17:45.142 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:45.142 "is_configured": true, 00:17:45.142 "data_offset": 2048, 00:17:45.142 "data_size": 63488 00:17:45.142 }, 00:17:45.142 { 00:17:45.142 "name": "BaseBdev2", 00:17:45.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.142 "is_configured": false, 00:17:45.142 "data_offset": 0, 00:17:45.142 "data_size": 0 00:17:45.142 }, 00:17:45.142 { 00:17:45.142 "name": "BaseBdev3", 00:17:45.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.142 "is_configured": false, 00:17:45.142 "data_offset": 0, 00:17:45.142 "data_size": 0 00:17:45.142 }, 00:17:45.142 { 00:17:45.142 "name": "BaseBdev4", 00:17:45.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.142 "is_configured": false, 00:17:45.142 "data_offset": 0, 00:17:45.142 "data_size": 0 00:17:45.142 } 00:17:45.142 ] 00:17:45.142 }' 00:17:45.142 13:40:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.142 13:40:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:45.726 [2024-07-15 13:40:33.204682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:45.726 BaseBdev2 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:45.726 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:45.999 [ 00:17:45.999 { 00:17:45.999 "name": "BaseBdev2", 00:17:45.999 "aliases": [ 00:17:45.999 "b1d22a12-c1e7-4dd5-a33d-f0456a51d039" 00:17:45.999 ], 00:17:45.999 "product_name": "Malloc disk", 00:17:45.999 "block_size": 512, 00:17:45.999 "num_blocks": 65536, 00:17:45.999 "uuid": "b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:45.999 "assigned_rate_limits": { 00:17:45.999 "rw_ios_per_sec": 0, 00:17:45.999 "rw_mbytes_per_sec": 0, 00:17:45.999 "r_mbytes_per_sec": 0, 00:17:45.999 "w_mbytes_per_sec": 0 00:17:45.999 }, 00:17:45.999 "claimed": true, 00:17:45.999 "claim_type": "exclusive_write", 00:17:45.999 "zoned": false, 00:17:45.999 "supported_io_types": { 00:17:45.999 "read": true, 00:17:45.999 "write": true, 00:17:45.999 "unmap": true, 00:17:45.999 "flush": true, 
00:17:45.999 "reset": true, 00:17:45.999 "nvme_admin": false, 00:17:45.999 "nvme_io": false, 00:17:45.999 "nvme_io_md": false, 00:17:45.999 "write_zeroes": true, 00:17:45.999 "zcopy": true, 00:17:45.999 "get_zone_info": false, 00:17:45.999 "zone_management": false, 00:17:45.999 "zone_append": false, 00:17:45.999 "compare": false, 00:17:45.999 "compare_and_write": false, 00:17:45.999 "abort": true, 00:17:45.999 "seek_hole": false, 00:17:45.999 "seek_data": false, 00:17:45.999 "copy": true, 00:17:45.999 "nvme_iov_md": false 00:17:45.999 }, 00:17:45.999 "memory_domains": [ 00:17:45.999 { 00:17:45.999 "dma_device_id": "system", 00:17:45.999 "dma_device_type": 1 00:17:45.999 }, 00:17:45.999 { 00:17:45.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.999 "dma_device_type": 2 00:17:45.999 } 00:17:45.999 ], 00:17:45.999 "driver_specific": {} 00:17:45.999 } 00:17:45.999 ] 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.999 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:46.256 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.256 "name": "Existed_Raid", 00:17:46.256 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:46.256 "strip_size_kb": 0, 00:17:46.256 "state": "configuring", 00:17:46.256 "raid_level": "raid1", 00:17:46.256 "superblock": true, 00:17:46.256 "num_base_bdevs": 4, 00:17:46.256 "num_base_bdevs_discovered": 2, 00:17:46.256 "num_base_bdevs_operational": 4, 00:17:46.256 "base_bdevs_list": [ 00:17:46.256 { 00:17:46.256 "name": "BaseBdev1", 00:17:46.256 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:46.256 "is_configured": true, 00:17:46.256 "data_offset": 2048, 00:17:46.256 "data_size": 63488 00:17:46.256 }, 00:17:46.256 { 00:17:46.256 "name": "BaseBdev2", 00:17:46.256 "uuid": 
"b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:46.256 "is_configured": true, 00:17:46.256 "data_offset": 2048, 00:17:46.256 "data_size": 63488 00:17:46.256 }, 00:17:46.256 { 00:17:46.256 "name": "BaseBdev3", 00:17:46.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.256 "is_configured": false, 00:17:46.256 "data_offset": 0, 00:17:46.256 "data_size": 0 00:17:46.256 }, 00:17:46.257 { 00:17:46.257 "name": "BaseBdev4", 00:17:46.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.257 "is_configured": false, 00:17:46.257 "data_offset": 0, 00:17:46.257 "data_size": 0 00:17:46.257 } 00:17:46.257 ] 00:17:46.257 }' 00:17:46.257 13:40:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.257 13:40:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:46.822 [2024-07-15 13:40:34.423934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:46.822 BaseBdev3 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:46.822 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:46.823 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:47.080 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:47.337 [ 00:17:47.337 { 00:17:47.337 "name": "BaseBdev3", 00:17:47.337 "aliases": [ 00:17:47.337 "9189d051-5f52-44c7-8c6b-4c3640c05a6e" 00:17:47.337 ], 00:17:47.337 "product_name": "Malloc disk", 00:17:47.337 "block_size": 512, 00:17:47.337 "num_blocks": 65536, 00:17:47.337 "uuid": "9189d051-5f52-44c7-8c6b-4c3640c05a6e", 00:17:47.337 "assigned_rate_limits": { 00:17:47.337 "rw_ios_per_sec": 0, 00:17:47.337 "rw_mbytes_per_sec": 0, 00:17:47.337 "r_mbytes_per_sec": 0, 00:17:47.337 "w_mbytes_per_sec": 0 00:17:47.337 }, 00:17:47.337 "claimed": true, 00:17:47.337 "claim_type": "exclusive_write", 00:17:47.337 "zoned": false, 00:17:47.337 "supported_io_types": { 00:17:47.337 "read": true, 00:17:47.337 "write": true, 00:17:47.337 "unmap": true, 00:17:47.337 "flush": true, 00:17:47.337 "reset": true, 00:17:47.337 "nvme_admin": false, 00:17:47.337 "nvme_io": false, 00:17:47.337 "nvme_io_md": false, 00:17:47.337 "write_zeroes": true, 00:17:47.337 "zcopy": true, 00:17:47.337 "get_zone_info": false, 00:17:47.337 "zone_management": false, 00:17:47.337 "zone_append": false, 00:17:47.337 "compare": false, 00:17:47.337 "compare_and_write": false, 00:17:47.337 "abort": true, 00:17:47.337 "seek_hole": false, 00:17:47.337 
"seek_data": false, 00:17:47.337 "copy": true, 00:17:47.337 "nvme_iov_md": false 00:17:47.337 }, 00:17:47.337 "memory_domains": [ 00:17:47.337 { 00:17:47.337 "dma_device_id": "system", 00:17:47.337 "dma_device_type": 1 00:17:47.337 }, 00:17:47.337 { 00:17:47.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.337 "dma_device_type": 2 00:17:47.337 } 00:17:47.337 ], 00:17:47.337 "driver_specific": {} 00:17:47.337 } 00:17:47.337 ] 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.337 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:47.595 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.595 "name": "Existed_Raid", 00:17:47.595 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:47.595 "strip_size_kb": 0, 00:17:47.595 "state": "configuring", 00:17:47.595 "raid_level": "raid1", 00:17:47.595 "superblock": true, 00:17:47.595 "num_base_bdevs": 4, 00:17:47.595 "num_base_bdevs_discovered": 3, 00:17:47.595 "num_base_bdevs_operational": 4, 00:17:47.595 "base_bdevs_list": [ 00:17:47.595 { 00:17:47.595 "name": "BaseBdev1", 00:17:47.595 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:47.595 "is_configured": true, 00:17:47.595 "data_offset": 2048, 00:17:47.595 "data_size": 63488 00:17:47.595 }, 00:17:47.595 { 00:17:47.595 "name": "BaseBdev2", 00:17:47.595 "uuid": "b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:47.595 "is_configured": true, 00:17:47.595 "data_offset": 2048, 00:17:47.595 "data_size": 63488 00:17:47.595 }, 00:17:47.595 { 00:17:47.595 "name": "BaseBdev3", 00:17:47.595 "uuid": "9189d051-5f52-44c7-8c6b-4c3640c05a6e", 00:17:47.595 "is_configured": true, 00:17:47.595 "data_offset": 2048, 00:17:47.595 "data_size": 63488 00:17:47.595 }, 00:17:47.595 { 00:17:47.595 "name": "BaseBdev4", 00:17:47.595 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:47.595 "is_configured": false, 00:17:47.595 "data_offset": 0, 00:17:47.595 "data_size": 0 00:17:47.595 } 00:17:47.595 ] 00:17:47.595 }' 00:17:47.595 13:40:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.595 13:40:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:47.853 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:48.111 [2024-07-15 13:40:35.623116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:48.111 [2024-07-15 13:40:35.623270] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7d3840 00:17:48.111 [2024-07-15 13:40:35.623281] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:48.111 [2024-07-15 13:40:35.623408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7d3480 00:17:48.111 [2024-07-15 13:40:35.623507] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7d3840 00:17:48.111 [2024-07-15 13:40:35.623514] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7d3840 00:17:48.111 [2024-07-15 13:40:35.623581] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:48.111 BaseBdev4 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:48.111 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:48.368 [ 00:17:48.368 { 00:17:48.368 "name": "BaseBdev4", 00:17:48.368 "aliases": [ 00:17:48.368 "6917365a-c112-4cbb-a03e-e75ffd033e57" 00:17:48.368 ], 00:17:48.368 "product_name": "Malloc disk", 00:17:48.368 "block_size": 512, 00:17:48.368 "num_blocks": 65536, 00:17:48.368 "uuid": "6917365a-c112-4cbb-a03e-e75ffd033e57", 00:17:48.368 "assigned_rate_limits": { 00:17:48.368 "rw_ios_per_sec": 0, 00:17:48.368 "rw_mbytes_per_sec": 0, 00:17:48.368 "r_mbytes_per_sec": 0, 00:17:48.368 "w_mbytes_per_sec": 0 00:17:48.368 }, 00:17:48.368 "claimed": true, 00:17:48.368 "claim_type": "exclusive_write", 00:17:48.368 "zoned": false, 00:17:48.368 "supported_io_types": { 00:17:48.368 "read": true, 00:17:48.368 "write": true, 00:17:48.368 "unmap": true, 00:17:48.368 "flush": true, 00:17:48.368 "reset": true, 00:17:48.368 "nvme_admin": false, 00:17:48.368 "nvme_io": false, 00:17:48.368 "nvme_io_md": false, 00:17:48.368 
"write_zeroes": true, 00:17:48.368 "zcopy": true, 00:17:48.368 "get_zone_info": false, 00:17:48.368 "zone_management": false, 00:17:48.368 "zone_append": false, 00:17:48.368 "compare": false, 00:17:48.368 "compare_and_write": false, 00:17:48.368 "abort": true, 00:17:48.368 "seek_hole": false, 00:17:48.368 "seek_data": false, 00:17:48.368 "copy": true, 00:17:48.368 "nvme_iov_md": false 00:17:48.368 }, 00:17:48.368 "memory_domains": [ 00:17:48.368 { 00:17:48.368 "dma_device_id": "system", 00:17:48.368 "dma_device_type": 1 00:17:48.368 }, 00:17:48.368 { 00:17:48.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.368 "dma_device_type": 2 00:17:48.368 } 00:17:48.368 ], 00:17:48.368 "driver_specific": {} 00:17:48.368 } 00:17:48.368 ] 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.368 13:40:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:48.625 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.625 "name": "Existed_Raid", 00:17:48.625 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:48.625 "strip_size_kb": 0, 00:17:48.625 "state": "online", 00:17:48.625 "raid_level": "raid1", 00:17:48.625 "superblock": true, 00:17:48.625 "num_base_bdevs": 4, 00:17:48.625 "num_base_bdevs_discovered": 4, 00:17:48.625 "num_base_bdevs_operational": 4, 00:17:48.625 "base_bdevs_list": [ 00:17:48.625 { 00:17:48.625 "name": "BaseBdev1", 00:17:48.625 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:48.625 "is_configured": true, 00:17:48.625 "data_offset": 2048, 00:17:48.625 "data_size": 63488 00:17:48.625 }, 00:17:48.625 { 00:17:48.625 "name": "BaseBdev2", 00:17:48.625 "uuid": "b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:48.625 "is_configured": true, 00:17:48.625 "data_offset": 2048, 00:17:48.625 "data_size": 63488 00:17:48.625 }, 00:17:48.625 { 
00:17:48.625 "name": "BaseBdev3", 00:17:48.625 "uuid": "9189d051-5f52-44c7-8c6b-4c3640c05a6e", 00:17:48.625 "is_configured": true, 00:17:48.625 "data_offset": 2048, 00:17:48.625 "data_size": 63488 00:17:48.625 }, 00:17:48.625 { 00:17:48.625 "name": "BaseBdev4", 00:17:48.625 "uuid": "6917365a-c112-4cbb-a03e-e75ffd033e57", 00:17:48.625 "is_configured": true, 00:17:48.625 "data_offset": 2048, 00:17:48.625 "data_size": 63488 00:17:48.625 } 00:17:48.625 ] 00:17:48.625 }' 00:17:48.625 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.625 13:40:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:49.189 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:49.189 [2024-07-15 13:40:36.806413] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:49.446 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:49.446 "name": "Existed_Raid", 00:17:49.446 "aliases": [ 00:17:49.446 "1bb65752-616f-4a1c-813f-bc4043b84456" 00:17:49.446 ], 00:17:49.446 "product_name": "Raid Volume", 00:17:49.446 "block_size": 512, 00:17:49.446 "num_blocks": 63488, 00:17:49.446 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:49.446 "assigned_rate_limits": { 00:17:49.446 "rw_ios_per_sec": 0, 00:17:49.446 "rw_mbytes_per_sec": 0, 00:17:49.446 "r_mbytes_per_sec": 0, 00:17:49.446 "w_mbytes_per_sec": 0 00:17:49.446 }, 00:17:49.446 "claimed": false, 00:17:49.446 "zoned": false, 00:17:49.446 "supported_io_types": { 00:17:49.446 "read": true, 00:17:49.446 "write": true, 00:17:49.446 "unmap": false, 00:17:49.446 "flush": false, 00:17:49.446 "reset": true, 00:17:49.446 "nvme_admin": false, 00:17:49.446 "nvme_io": false, 00:17:49.446 "nvme_io_md": false, 00:17:49.446 "write_zeroes": true, 00:17:49.446 "zcopy": false, 00:17:49.446 "get_zone_info": false, 00:17:49.446 "zone_management": false, 00:17:49.446 "zone_append": false, 00:17:49.446 "compare": false, 00:17:49.446 "compare_and_write": false, 00:17:49.446 "abort": false, 00:17:49.446 "seek_hole": false, 00:17:49.446 "seek_data": false, 00:17:49.446 "copy": false, 00:17:49.446 "nvme_iov_md": false 00:17:49.446 }, 00:17:49.446 "memory_domains": [ 00:17:49.446 { 00:17:49.446 "dma_device_id": "system", 00:17:49.446 "dma_device_type": 1 00:17:49.446 }, 00:17:49.446 { 00:17:49.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.447 "dma_device_type": 2 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "dma_device_id": "system", 00:17:49.447 "dma_device_type": 1 00:17:49.447 }, 00:17:49.447 { 
00:17:49.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.447 "dma_device_type": 2 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "dma_device_id": "system", 00:17:49.447 "dma_device_type": 1 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.447 "dma_device_type": 2 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "dma_device_id": "system", 00:17:49.447 "dma_device_type": 1 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.447 "dma_device_type": 2 00:17:49.447 } 00:17:49.447 ], 00:17:49.447 "driver_specific": { 00:17:49.447 "raid": { 00:17:49.447 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:49.447 "strip_size_kb": 0, 00:17:49.447 "state": "online", 00:17:49.447 "raid_level": "raid1", 00:17:49.447 "superblock": true, 00:17:49.447 "num_base_bdevs": 4, 00:17:49.447 "num_base_bdevs_discovered": 4, 00:17:49.447 "num_base_bdevs_operational": 4, 00:17:49.447 "base_bdevs_list": [ 00:17:49.447 { 00:17:49.447 "name": "BaseBdev1", 00:17:49.447 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:49.447 "is_configured": true, 00:17:49.447 "data_offset": 2048, 00:17:49.447 "data_size": 63488 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "name": "BaseBdev2", 00:17:49.447 "uuid": "b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:49.447 "is_configured": true, 00:17:49.447 "data_offset": 2048, 00:17:49.447 "data_size": 63488 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "name": "BaseBdev3", 00:17:49.447 "uuid": "9189d051-5f52-44c7-8c6b-4c3640c05a6e", 00:17:49.447 "is_configured": true, 00:17:49.447 "data_offset": 2048, 00:17:49.447 "data_size": 63488 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "name": "BaseBdev4", 00:17:49.447 "uuid": "6917365a-c112-4cbb-a03e-e75ffd033e57", 00:17:49.447 "is_configured": true, 00:17:49.447 "data_offset": 2048, 00:17:49.447 "data_size": 63488 00:17:49.447 } 00:17:49.447 ] 00:17:49.447 } 00:17:49.447 } 00:17:49.447 }' 00:17:49.447 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:49.447 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:49.447 BaseBdev2 00:17:49.447 BaseBdev3 00:17:49.447 BaseBdev4' 00:17:49.447 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:49.447 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:49.447 13:40:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:49.447 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:49.447 "name": "BaseBdev1", 00:17:49.447 "aliases": [ 00:17:49.447 "d64a084d-7af3-4768-9053-01884810e5b5" 00:17:49.447 ], 00:17:49.447 "product_name": "Malloc disk", 00:17:49.447 "block_size": 512, 00:17:49.447 "num_blocks": 65536, 00:17:49.447 "uuid": "d64a084d-7af3-4768-9053-01884810e5b5", 00:17:49.447 "assigned_rate_limits": { 00:17:49.447 "rw_ios_per_sec": 0, 00:17:49.447 "rw_mbytes_per_sec": 0, 00:17:49.447 "r_mbytes_per_sec": 0, 00:17:49.447 "w_mbytes_per_sec": 0 00:17:49.447 }, 00:17:49.447 "claimed": true, 00:17:49.447 "claim_type": "exclusive_write", 00:17:49.447 "zoned": false, 00:17:49.447 "supported_io_types": { 00:17:49.447 "read": true, 00:17:49.447 "write": true, 
00:17:49.447 "unmap": true, 00:17:49.447 "flush": true, 00:17:49.447 "reset": true, 00:17:49.447 "nvme_admin": false, 00:17:49.447 "nvme_io": false, 00:17:49.447 "nvme_io_md": false, 00:17:49.447 "write_zeroes": true, 00:17:49.447 "zcopy": true, 00:17:49.447 "get_zone_info": false, 00:17:49.447 "zone_management": false, 00:17:49.447 "zone_append": false, 00:17:49.447 "compare": false, 00:17:49.447 "compare_and_write": false, 00:17:49.447 "abort": true, 00:17:49.447 "seek_hole": false, 00:17:49.447 "seek_data": false, 00:17:49.447 "copy": true, 00:17:49.447 "nvme_iov_md": false 00:17:49.447 }, 00:17:49.447 "memory_domains": [ 00:17:49.447 { 00:17:49.447 "dma_device_id": "system", 00:17:49.447 "dma_device_type": 1 00:17:49.447 }, 00:17:49.447 { 00:17:49.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.447 "dma_device_type": 2 00:17:49.447 } 00:17:49.447 ], 00:17:49.447 "driver_specific": {} 00:17:49.447 }' 00:17:49.447 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.704 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.960 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.960 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.960 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:49.960 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:49.960 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:49.960 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:49.960 "name": "BaseBdev2", 00:17:49.960 "aliases": [ 00:17:49.960 "b1d22a12-c1e7-4dd5-a33d-f0456a51d039" 00:17:49.960 ], 00:17:49.960 "product_name": "Malloc disk", 00:17:49.960 "block_size": 512, 00:17:49.961 "num_blocks": 65536, 00:17:49.961 "uuid": "b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:49.961 "assigned_rate_limits": { 00:17:49.961 "rw_ios_per_sec": 0, 00:17:49.961 "rw_mbytes_per_sec": 0, 00:17:49.961 "r_mbytes_per_sec": 0, 00:17:49.961 "w_mbytes_per_sec": 0 00:17:49.961 }, 00:17:49.961 "claimed": true, 00:17:49.961 "claim_type": "exclusive_write", 00:17:49.961 "zoned": false, 00:17:49.961 "supported_io_types": { 00:17:49.961 "read": true, 00:17:49.961 "write": true, 00:17:49.961 "unmap": true, 00:17:49.961 "flush": true, 00:17:49.961 "reset": true, 00:17:49.961 "nvme_admin": false, 00:17:49.961 
"nvme_io": false, 00:17:49.961 "nvme_io_md": false, 00:17:49.961 "write_zeroes": true, 00:17:49.961 "zcopy": true, 00:17:49.961 "get_zone_info": false, 00:17:49.961 "zone_management": false, 00:17:49.961 "zone_append": false, 00:17:49.961 "compare": false, 00:17:49.961 "compare_and_write": false, 00:17:49.961 "abort": true, 00:17:49.961 "seek_hole": false, 00:17:49.961 "seek_data": false, 00:17:49.961 "copy": true, 00:17:49.961 "nvme_iov_md": false 00:17:49.961 }, 00:17:49.961 "memory_domains": [ 00:17:49.961 { 00:17:49.961 "dma_device_id": "system", 00:17:49.961 "dma_device_type": 1 00:17:49.961 }, 00:17:49.961 { 00:17:49.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.961 "dma_device_type": 2 00:17:49.961 } 00:17:49.961 ], 00:17:49.961 "driver_specific": {} 00:17:49.961 }' 00:17:49.961 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.961 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.217 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.473 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:50.473 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:50.473 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:50.473 13:40:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:50.473 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.473 "name": "BaseBdev3", 00:17:50.473 "aliases": [ 00:17:50.473 "9189d051-5f52-44c7-8c6b-4c3640c05a6e" 00:17:50.473 ], 00:17:50.473 "product_name": "Malloc disk", 00:17:50.473 "block_size": 512, 00:17:50.473 "num_blocks": 65536, 00:17:50.473 "uuid": "9189d051-5f52-44c7-8c6b-4c3640c05a6e", 00:17:50.473 "assigned_rate_limits": { 00:17:50.473 "rw_ios_per_sec": 0, 00:17:50.473 "rw_mbytes_per_sec": 0, 00:17:50.473 "r_mbytes_per_sec": 0, 00:17:50.473 "w_mbytes_per_sec": 0 00:17:50.474 }, 00:17:50.474 "claimed": true, 00:17:50.474 "claim_type": "exclusive_write", 00:17:50.474 "zoned": false, 00:17:50.474 "supported_io_types": { 00:17:50.474 "read": true, 00:17:50.474 "write": true, 00:17:50.474 "unmap": true, 00:17:50.474 "flush": true, 00:17:50.474 "reset": true, 00:17:50.474 "nvme_admin": false, 00:17:50.474 "nvme_io": false, 00:17:50.474 "nvme_io_md": false, 00:17:50.474 "write_zeroes": true, 00:17:50.474 "zcopy": true, 00:17:50.474 
"get_zone_info": false, 00:17:50.474 "zone_management": false, 00:17:50.474 "zone_append": false, 00:17:50.474 "compare": false, 00:17:50.474 "compare_and_write": false, 00:17:50.474 "abort": true, 00:17:50.474 "seek_hole": false, 00:17:50.474 "seek_data": false, 00:17:50.474 "copy": true, 00:17:50.474 "nvme_iov_md": false 00:17:50.474 }, 00:17:50.474 "memory_domains": [ 00:17:50.474 { 00:17:50.474 "dma_device_id": "system", 00:17:50.474 "dma_device_type": 1 00:17:50.474 }, 00:17:50.474 { 00:17:50.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.474 "dma_device_type": 2 00:17:50.474 } 00:17:50.474 ], 00:17:50.474 "driver_specific": {} 00:17:50.474 }' 00:17:50.474 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.474 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.730 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.987 "name": "BaseBdev4", 00:17:50.987 "aliases": [ 00:17:50.987 "6917365a-c112-4cbb-a03e-e75ffd033e57" 00:17:50.987 ], 00:17:50.987 "product_name": "Malloc disk", 00:17:50.987 "block_size": 512, 00:17:50.987 "num_blocks": 65536, 00:17:50.987 "uuid": "6917365a-c112-4cbb-a03e-e75ffd033e57", 00:17:50.987 "assigned_rate_limits": { 00:17:50.987 "rw_ios_per_sec": 0, 00:17:50.987 "rw_mbytes_per_sec": 0, 00:17:50.987 "r_mbytes_per_sec": 0, 00:17:50.987 "w_mbytes_per_sec": 0 00:17:50.987 }, 00:17:50.987 "claimed": true, 00:17:50.987 "claim_type": "exclusive_write", 00:17:50.987 "zoned": false, 00:17:50.987 "supported_io_types": { 00:17:50.987 "read": true, 00:17:50.987 "write": true, 00:17:50.987 "unmap": true, 00:17:50.987 "flush": true, 00:17:50.987 "reset": true, 00:17:50.987 "nvme_admin": false, 00:17:50.987 "nvme_io": false, 00:17:50.987 "nvme_io_md": false, 00:17:50.987 "write_zeroes": true, 00:17:50.987 "zcopy": true, 00:17:50.987 "get_zone_info": false, 00:17:50.987 "zone_management": false, 00:17:50.987 "zone_append": false, 00:17:50.987 "compare": false, 
00:17:50.987 "compare_and_write": false, 00:17:50.987 "abort": true, 00:17:50.987 "seek_hole": false, 00:17:50.987 "seek_data": false, 00:17:50.987 "copy": true, 00:17:50.987 "nvme_iov_md": false 00:17:50.987 }, 00:17:50.987 "memory_domains": [ 00:17:50.987 { 00:17:50.987 "dma_device_id": "system", 00:17:50.987 "dma_device_type": 1 00:17:50.987 }, 00:17:50.987 { 00:17:50.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.987 "dma_device_type": 2 00:17:50.987 } 00:17:50.987 ], 00:17:50.987 "driver_specific": {} 00:17:50.987 }' 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.987 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:51.245 13:40:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:51.502 [2024-07-15 13:40:39.007919] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.502 13:40:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.502 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.759 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.760 "name": "Existed_Raid", 00:17:51.760 "uuid": "1bb65752-616f-4a1c-813f-bc4043b84456", 00:17:51.760 "strip_size_kb": 0, 00:17:51.760 "state": "online", 00:17:51.760 "raid_level": "raid1", 00:17:51.760 "superblock": true, 00:17:51.760 "num_base_bdevs": 4, 00:17:51.760 "num_base_bdevs_discovered": 3, 00:17:51.760 "num_base_bdevs_operational": 3, 00:17:51.760 "base_bdevs_list": [ 00:17:51.760 { 00:17:51.760 "name": null, 00:17:51.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.760 "is_configured": false, 00:17:51.760 "data_offset": 2048, 00:17:51.760 "data_size": 63488 00:17:51.760 }, 00:17:51.760 { 00:17:51.760 "name": "BaseBdev2", 00:17:51.760 "uuid": "b1d22a12-c1e7-4dd5-a33d-f0456a51d039", 00:17:51.760 "is_configured": true, 00:17:51.760 "data_offset": 2048, 00:17:51.760 "data_size": 63488 00:17:51.760 }, 00:17:51.760 { 00:17:51.760 "name": "BaseBdev3", 00:17:51.760 "uuid": "9189d051-5f52-44c7-8c6b-4c3640c05a6e", 00:17:51.760 "is_configured": true, 00:17:51.760 "data_offset": 2048, 00:17:51.760 "data_size": 63488 00:17:51.760 }, 00:17:51.760 { 00:17:51.760 "name": "BaseBdev4", 00:17:51.760 "uuid": "6917365a-c112-4cbb-a03e-e75ffd033e57", 00:17:51.760 "is_configured": true, 00:17:51.760 "data_offset": 2048, 00:17:51.760 "data_size": 63488 00:17:51.760 } 00:17:51.760 ] 00:17:51.760 }' 00:17:51.760 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.760 13:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:52.327 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:52.328 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:52.328 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.328 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:52.328 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:52.328 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:52.328 13:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:52.587 [2024-07-15 13:40:40.031435] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:52.587 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:52.587 13:40:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:52.587 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.587 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:52.846 [2024-07-15 13:40:40.394507] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.846 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:53.105 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:53.105 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:53.105 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:53.364 [2024-07-15 13:40:40.750313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:53.364 [2024-07-15 13:40:40.750384] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:53.364 [2024-07-15 13:40:40.762225] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:53.364 [2024-07-15 13:40:40.762270] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:53.364 [2024-07-15 13:40:40.762278] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d3840 name Existed_Raid, state offline 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:53.364 13:40:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:53.364 13:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:53.624 BaseBdev2 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:53.624 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:53.883 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:54.142 [ 00:17:54.142 { 00:17:54.142 "name": "BaseBdev2", 00:17:54.142 "aliases": [ 00:17:54.142 "5981284b-007b-4cbe-971d-71f46af5a5b6" 00:17:54.142 ], 00:17:54.142 "product_name": "Malloc disk", 00:17:54.142 "block_size": 512, 00:17:54.142 "num_blocks": 65536, 00:17:54.142 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:17:54.142 "assigned_rate_limits": { 00:17:54.142 "rw_ios_per_sec": 0, 00:17:54.142 "rw_mbytes_per_sec": 0, 00:17:54.142 "r_mbytes_per_sec": 0, 00:17:54.142 "w_mbytes_per_sec": 0 00:17:54.142 }, 00:17:54.142 "claimed": false, 00:17:54.142 "zoned": false, 00:17:54.142 "supported_io_types": { 00:17:54.142 "read": true, 00:17:54.142 "write": true, 00:17:54.142 "unmap": true, 00:17:54.142 "flush": true, 00:17:54.142 "reset": true, 00:17:54.142 "nvme_admin": false, 00:17:54.142 "nvme_io": false, 00:17:54.142 "nvme_io_md": false, 00:17:54.142 "write_zeroes": true, 00:17:54.142 "zcopy": true, 00:17:54.142 "get_zone_info": false, 00:17:54.142 "zone_management": false, 00:17:54.142 "zone_append": false, 00:17:54.142 "compare": false, 00:17:54.142 "compare_and_write": false, 00:17:54.142 "abort": true, 00:17:54.142 "seek_hole": false, 00:17:54.142 "seek_data": false, 00:17:54.142 "copy": true, 00:17:54.142 "nvme_iov_md": false 00:17:54.142 }, 00:17:54.142 "memory_domains": [ 00:17:54.142 { 00:17:54.142 "dma_device_id": "system", 00:17:54.142 "dma_device_type": 1 00:17:54.142 }, 00:17:54.142 { 00:17:54.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.142 "dma_device_type": 2 00:17:54.142 } 00:17:54.142 ], 00:17:54.142 "driver_specific": {} 00:17:54.142 } 00:17:54.142 ] 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:54.142 BaseBdev3 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:54.142 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:54.143 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:54.143 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:54.143 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:54.401 13:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:54.401 [ 00:17:54.401 { 00:17:54.401 "name": "BaseBdev3", 00:17:54.401 "aliases": [ 00:17:54.401 "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e" 00:17:54.401 ], 00:17:54.401 "product_name": "Malloc disk", 00:17:54.401 "block_size": 512, 00:17:54.401 "num_blocks": 65536, 00:17:54.401 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:17:54.401 "assigned_rate_limits": { 00:17:54.401 "rw_ios_per_sec": 0, 00:17:54.401 "rw_mbytes_per_sec": 0, 00:17:54.401 "r_mbytes_per_sec": 0, 00:17:54.401 "w_mbytes_per_sec": 0 00:17:54.401 }, 00:17:54.401 "claimed": false, 00:17:54.401 "zoned": false, 00:17:54.401 "supported_io_types": { 00:17:54.401 "read": true, 00:17:54.401 "write": true, 00:17:54.401 "unmap": true, 00:17:54.401 "flush": true, 00:17:54.402 "reset": true, 00:17:54.402 "nvme_admin": false, 00:17:54.402 "nvme_io": false, 00:17:54.402 "nvme_io_md": false, 00:17:54.402 "write_zeroes": true, 00:17:54.402 "zcopy": true, 00:17:54.402 "get_zone_info": false, 00:17:54.402 "zone_management": false, 00:17:54.402 "zone_append": false, 00:17:54.402 "compare": false, 00:17:54.402 "compare_and_write": false, 00:17:54.402 "abort": true, 00:17:54.402 "seek_hole": false, 00:17:54.402 "seek_data": false, 00:17:54.402 "copy": true, 00:17:54.402 "nvme_iov_md": false 00:17:54.402 }, 00:17:54.402 "memory_domains": [ 00:17:54.402 { 00:17:54.402 "dma_device_id": "system", 00:17:54.402 "dma_device_type": 1 00:17:54.402 }, 00:17:54.402 { 00:17:54.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.402 "dma_device_type": 2 00:17:54.402 } 00:17:54.402 ], 00:17:54.402 "driver_specific": {} 00:17:54.402 } 00:17:54.402 ] 00:17:54.402 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:54.402 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:54.402 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:54.402 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:54.660 BaseBdev4 00:17:54.660 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:54.660 13:40:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:54.660 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:54.661 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:54.661 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:54.661 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:54.661 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:54.919 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:54.919 [ 00:17:54.919 { 00:17:54.919 "name": "BaseBdev4", 00:17:54.919 "aliases": [ 00:17:54.919 "3478ca77-7aa1-4669-91eb-d46c649f20e7" 00:17:54.919 ], 00:17:54.919 "product_name": "Malloc disk", 00:17:54.919 "block_size": 512, 00:17:54.919 "num_blocks": 65536, 00:17:54.919 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:17:54.919 "assigned_rate_limits": { 00:17:54.919 "rw_ios_per_sec": 0, 00:17:54.919 "rw_mbytes_per_sec": 0, 00:17:54.919 "r_mbytes_per_sec": 0, 00:17:54.919 "w_mbytes_per_sec": 0 00:17:54.919 }, 00:17:54.919 "claimed": false, 00:17:54.919 "zoned": false, 00:17:54.919 "supported_io_types": { 00:17:54.919 "read": true, 00:17:54.919 "write": true, 00:17:54.919 "unmap": true, 00:17:54.919 "flush": true, 00:17:54.919 "reset": true, 00:17:54.919 "nvme_admin": false, 00:17:54.919 "nvme_io": false, 00:17:54.919 "nvme_io_md": false, 00:17:54.919 "write_zeroes": true, 00:17:54.919 "zcopy": true, 00:17:54.919 "get_zone_info": false, 00:17:54.919 "zone_management": false, 00:17:54.919 "zone_append": false, 00:17:54.919 "compare": false, 00:17:54.919 "compare_and_write": false, 00:17:54.919 "abort": true, 00:17:54.919 "seek_hole": false, 00:17:54.919 "seek_data": false, 00:17:54.919 "copy": true, 00:17:54.919 "nvme_iov_md": false 00:17:54.919 }, 00:17:54.920 "memory_domains": [ 00:17:54.920 { 00:17:54.920 "dma_device_id": "system", 00:17:54.920 "dma_device_type": 1 00:17:54.920 }, 00:17:54.920 { 00:17:54.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.920 "dma_device_type": 2 00:17:54.920 } 00:17:54.920 ], 00:17:54.920 "driver_specific": {} 00:17:54.920 } 00:17:54.920 ] 00:17:54.920 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:54.920 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:54.920 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:54.920 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:55.179 [2024-07-15 13:40:42.664029] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:55.179 [2024-07-15 13:40:42.664068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:55.179 [2024-07-15 13:40:42.664083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:17:55.179 [2024-07-15 13:40:42.665108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:55.179 [2024-07-15 13:40:42.665151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.179 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.437 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.437 "name": "Existed_Raid", 00:17:55.437 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:17:55.437 "strip_size_kb": 0, 00:17:55.437 "state": "configuring", 00:17:55.437 "raid_level": "raid1", 00:17:55.437 "superblock": true, 00:17:55.437 "num_base_bdevs": 4, 00:17:55.437 "num_base_bdevs_discovered": 3, 00:17:55.437 "num_base_bdevs_operational": 4, 00:17:55.437 "base_bdevs_list": [ 00:17:55.437 { 00:17:55.437 "name": "BaseBdev1", 00:17:55.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.437 "is_configured": false, 00:17:55.437 "data_offset": 0, 00:17:55.437 "data_size": 0 00:17:55.437 }, 00:17:55.437 { 00:17:55.437 "name": "BaseBdev2", 00:17:55.437 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:17:55.437 "is_configured": true, 00:17:55.437 "data_offset": 2048, 00:17:55.437 "data_size": 63488 00:17:55.437 }, 00:17:55.437 { 00:17:55.437 "name": "BaseBdev3", 00:17:55.437 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:17:55.437 "is_configured": true, 00:17:55.437 "data_offset": 2048, 00:17:55.437 "data_size": 63488 00:17:55.437 }, 00:17:55.437 { 00:17:55.437 "name": "BaseBdev4", 00:17:55.437 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:17:55.437 "is_configured": true, 00:17:55.437 "data_offset": 2048, 00:17:55.437 "data_size": 63488 00:17:55.437 } 00:17:55.437 ] 00:17:55.437 }' 00:17:55.437 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.437 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:56.004 [2024-07-15 13:40:43.486119] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.004 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.262 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.262 "name": "Existed_Raid", 00:17:56.262 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:17:56.262 "strip_size_kb": 0, 00:17:56.262 "state": "configuring", 00:17:56.262 "raid_level": "raid1", 00:17:56.262 "superblock": true, 00:17:56.262 "num_base_bdevs": 4, 00:17:56.262 "num_base_bdevs_discovered": 2, 00:17:56.262 "num_base_bdevs_operational": 4, 00:17:56.262 "base_bdevs_list": [ 00:17:56.262 { 00:17:56.262 "name": "BaseBdev1", 00:17:56.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.262 "is_configured": false, 00:17:56.262 "data_offset": 0, 00:17:56.262 "data_size": 0 00:17:56.262 }, 00:17:56.262 { 00:17:56.262 "name": null, 00:17:56.262 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:17:56.262 "is_configured": false, 00:17:56.262 "data_offset": 2048, 00:17:56.262 "data_size": 63488 00:17:56.262 }, 00:17:56.262 { 00:17:56.262 "name": "BaseBdev3", 00:17:56.262 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:17:56.262 "is_configured": true, 00:17:56.262 "data_offset": 2048, 00:17:56.262 "data_size": 63488 00:17:56.262 }, 00:17:56.262 { 00:17:56.262 "name": "BaseBdev4", 00:17:56.262 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:17:56.262 "is_configured": true, 00:17:56.262 "data_offset": 2048, 00:17:56.262 "data_size": 63488 00:17:56.262 } 00:17:56.262 ] 00:17:56.262 }' 00:17:56.262 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.262 13:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.829 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:56.829 
13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.829 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:56.829 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:57.087 [2024-07-15 13:40:44.500848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:57.087 BaseBdev1 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.087 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:57.346 [ 00:17:57.346 { 00:17:57.346 "name": "BaseBdev1", 00:17:57.346 "aliases": [ 00:17:57.346 "ed4f0d2d-5b7a-445a-8d78-316bd45085fd" 00:17:57.346 ], 00:17:57.346 "product_name": "Malloc disk", 00:17:57.346 "block_size": 512, 00:17:57.346 "num_blocks": 65536, 00:17:57.346 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:17:57.346 "assigned_rate_limits": { 00:17:57.346 "rw_ios_per_sec": 0, 00:17:57.346 "rw_mbytes_per_sec": 0, 00:17:57.346 "r_mbytes_per_sec": 0, 00:17:57.346 "w_mbytes_per_sec": 0 00:17:57.346 }, 00:17:57.346 "claimed": true, 00:17:57.346 "claim_type": "exclusive_write", 00:17:57.346 "zoned": false, 00:17:57.346 "supported_io_types": { 00:17:57.346 "read": true, 00:17:57.346 "write": true, 00:17:57.346 "unmap": true, 00:17:57.346 "flush": true, 00:17:57.346 "reset": true, 00:17:57.346 "nvme_admin": false, 00:17:57.346 "nvme_io": false, 00:17:57.346 "nvme_io_md": false, 00:17:57.346 "write_zeroes": true, 00:17:57.346 "zcopy": true, 00:17:57.346 "get_zone_info": false, 00:17:57.346 "zone_management": false, 00:17:57.346 "zone_append": false, 00:17:57.346 "compare": false, 00:17:57.346 "compare_and_write": false, 00:17:57.346 "abort": true, 00:17:57.346 "seek_hole": false, 00:17:57.346 "seek_data": false, 00:17:57.346 "copy": true, 00:17:57.346 "nvme_iov_md": false 00:17:57.346 }, 00:17:57.346 "memory_domains": [ 00:17:57.346 { 00:17:57.346 "dma_device_id": "system", 00:17:57.346 "dma_device_type": 1 00:17:57.346 }, 00:17:57.346 { 00:17:57.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.346 "dma_device_type": 2 00:17:57.346 } 00:17:57.346 ], 00:17:57.346 "driver_specific": {} 00:17:57.346 } 00:17:57.346 ] 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # 
return 0 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.346 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.605 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.605 "name": "Existed_Raid", 00:17:57.605 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:17:57.605 "strip_size_kb": 0, 00:17:57.605 "state": "configuring", 00:17:57.605 "raid_level": "raid1", 00:17:57.605 "superblock": true, 00:17:57.605 "num_base_bdevs": 4, 00:17:57.605 "num_base_bdevs_discovered": 3, 00:17:57.605 "num_base_bdevs_operational": 4, 00:17:57.605 "base_bdevs_list": [ 00:17:57.605 { 00:17:57.605 "name": "BaseBdev1", 00:17:57.605 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:17:57.605 "is_configured": true, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 }, 00:17:57.605 { 00:17:57.605 "name": null, 00:17:57.605 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:17:57.605 "is_configured": false, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 }, 00:17:57.605 { 00:17:57.605 "name": "BaseBdev3", 00:17:57.605 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:17:57.605 "is_configured": true, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 }, 00:17:57.605 { 00:17:57.605 "name": "BaseBdev4", 00:17:57.605 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:17:57.605 "is_configured": true, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 } 00:17:57.605 ] 00:17:57.605 }' 00:17:57.605 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.605 13:40:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:58.171 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.171 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:58.171 13:40:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:58.171 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:58.429 [2024-07-15 13:40:45.860372] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.429 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.687 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.687 "name": "Existed_Raid", 00:17:58.687 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:17:58.687 "strip_size_kb": 0, 00:17:58.687 "state": "configuring", 00:17:58.687 "raid_level": "raid1", 00:17:58.687 "superblock": true, 00:17:58.687 "num_base_bdevs": 4, 00:17:58.687 "num_base_bdevs_discovered": 2, 00:17:58.687 "num_base_bdevs_operational": 4, 00:17:58.687 "base_bdevs_list": [ 00:17:58.687 { 00:17:58.687 "name": "BaseBdev1", 00:17:58.687 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:17:58.687 "is_configured": true, 00:17:58.687 "data_offset": 2048, 00:17:58.687 "data_size": 63488 00:17:58.687 }, 00:17:58.687 { 00:17:58.687 "name": null, 00:17:58.687 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:17:58.687 "is_configured": false, 00:17:58.687 "data_offset": 2048, 00:17:58.687 "data_size": 63488 00:17:58.687 }, 00:17:58.687 { 00:17:58.687 "name": null, 00:17:58.687 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:17:58.687 "is_configured": false, 00:17:58.687 "data_offset": 2048, 00:17:58.687 "data_size": 63488 00:17:58.687 }, 00:17:58.687 { 00:17:58.687 "name": "BaseBdev4", 00:17:58.687 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:17:58.687 "is_configured": true, 00:17:58.687 "data_offset": 2048, 00:17:58.687 "data_size": 63488 00:17:58.687 } 00:17:58.687 ] 00:17:58.687 }' 00:17:58.687 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.687 13:40:46 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:17:58.944 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:58.944 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.202 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:59.202 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:59.460 [2024-07-15 13:40:46.887057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:59.460 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:59.460 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.460 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.461 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.461 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.461 "name": "Existed_Raid", 00:17:59.461 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:17:59.461 "strip_size_kb": 0, 00:17:59.461 "state": "configuring", 00:17:59.461 "raid_level": "raid1", 00:17:59.461 "superblock": true, 00:17:59.461 "num_base_bdevs": 4, 00:17:59.461 "num_base_bdevs_discovered": 3, 00:17:59.461 "num_base_bdevs_operational": 4, 00:17:59.461 "base_bdevs_list": [ 00:17:59.461 { 00:17:59.461 "name": "BaseBdev1", 00:17:59.461 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:17:59.461 "is_configured": true, 00:17:59.461 "data_offset": 2048, 00:17:59.461 "data_size": 63488 00:17:59.461 }, 00:17:59.461 { 00:17:59.461 "name": null, 00:17:59.461 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:17:59.461 "is_configured": false, 00:17:59.461 "data_offset": 2048, 00:17:59.461 "data_size": 63488 00:17:59.461 }, 00:17:59.461 { 00:17:59.461 "name": "BaseBdev3", 00:17:59.461 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:17:59.461 "is_configured": true, 00:17:59.461 "data_offset": 2048, 00:17:59.461 "data_size": 63488 00:17:59.461 }, 
00:17:59.461 { 00:17:59.461 "name": "BaseBdev4", 00:17:59.461 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:17:59.461 "is_configured": true, 00:17:59.461 "data_offset": 2048, 00:17:59.461 "data_size": 63488 00:17:59.461 } 00:17:59.461 ] 00:17:59.461 }' 00:17:59.461 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.461 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:00.026 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.026 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:00.283 [2024-07-15 13:40:47.873624] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.283 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.540 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.540 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.540 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.540 "name": "Existed_Raid", 00:18:00.540 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:18:00.540 "strip_size_kb": 0, 00:18:00.540 "state": "configuring", 00:18:00.540 "raid_level": "raid1", 00:18:00.540 "superblock": true, 00:18:00.540 "num_base_bdevs": 4, 00:18:00.540 "num_base_bdevs_discovered": 2, 00:18:00.540 "num_base_bdevs_operational": 4, 00:18:00.540 "base_bdevs_list": [ 00:18:00.540 { 00:18:00.540 "name": null, 00:18:00.540 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:18:00.540 "is_configured": false, 00:18:00.540 "data_offset": 2048, 00:18:00.540 "data_size": 63488 00:18:00.540 }, 00:18:00.540 { 00:18:00.540 "name": null, 00:18:00.540 "uuid": 
"5981284b-007b-4cbe-971d-71f46af5a5b6", 00:18:00.540 "is_configured": false, 00:18:00.540 "data_offset": 2048, 00:18:00.540 "data_size": 63488 00:18:00.540 }, 00:18:00.540 { 00:18:00.540 "name": "BaseBdev3", 00:18:00.540 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:18:00.540 "is_configured": true, 00:18:00.540 "data_offset": 2048, 00:18:00.540 "data_size": 63488 00:18:00.540 }, 00:18:00.540 { 00:18:00.540 "name": "BaseBdev4", 00:18:00.540 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:18:00.540 "is_configured": true, 00:18:00.540 "data_offset": 2048, 00:18:00.541 "data_size": 63488 00:18:00.541 } 00:18:00.541 ] 00:18:00.541 }' 00:18:00.541 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.541 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:01.106 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.106 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:01.364 [2024-07-15 13:40:48.907602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.364 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.621 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.621 "name": "Existed_Raid", 00:18:01.621 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:18:01.621 "strip_size_kb": 0, 00:18:01.621 "state": "configuring", 00:18:01.621 "raid_level": "raid1", 00:18:01.621 "superblock": true, 00:18:01.621 
"num_base_bdevs": 4, 00:18:01.621 "num_base_bdevs_discovered": 3, 00:18:01.621 "num_base_bdevs_operational": 4, 00:18:01.621 "base_bdevs_list": [ 00:18:01.621 { 00:18:01.621 "name": null, 00:18:01.621 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:18:01.621 "is_configured": false, 00:18:01.621 "data_offset": 2048, 00:18:01.621 "data_size": 63488 00:18:01.621 }, 00:18:01.621 { 00:18:01.621 "name": "BaseBdev2", 00:18:01.621 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:18:01.621 "is_configured": true, 00:18:01.621 "data_offset": 2048, 00:18:01.621 "data_size": 63488 00:18:01.621 }, 00:18:01.621 { 00:18:01.621 "name": "BaseBdev3", 00:18:01.621 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:18:01.621 "is_configured": true, 00:18:01.621 "data_offset": 2048, 00:18:01.621 "data_size": 63488 00:18:01.621 }, 00:18:01.621 { 00:18:01.621 "name": "BaseBdev4", 00:18:01.622 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:18:01.622 "is_configured": true, 00:18:01.622 "data_offset": 2048, 00:18:01.622 "data_size": 63488 00:18:01.622 } 00:18:01.622 ] 00:18:01.622 }' 00:18:01.622 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.622 13:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:02.186 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.186 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:02.186 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:02.186 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.186 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:02.444 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ed4f0d2d-5b7a-445a-8d78-316bd45085fd 00:18:02.702 [2024-07-15 13:40:50.125734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:02.702 [2024-07-15 13:40:50.125880] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7d5900 00:18:02.702 [2024-07-15 13:40:50.125889] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:02.702 [2024-07-15 13:40:50.126021] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7ca930 00:18:02.702 [2024-07-15 13:40:50.126127] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7d5900 00:18:02.702 [2024-07-15 13:40:50.126134] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7d5900 00:18:02.702 [2024-07-15 13:40:50.126202] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:02.702 NewBaseBdev 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:02.702 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.960 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:02.960 [ 00:18:02.960 { 00:18:02.960 "name": "NewBaseBdev", 00:18:02.960 "aliases": [ 00:18:02.960 "ed4f0d2d-5b7a-445a-8d78-316bd45085fd" 00:18:02.960 ], 00:18:02.960 "product_name": "Malloc disk", 00:18:02.960 "block_size": 512, 00:18:02.960 "num_blocks": 65536, 00:18:02.960 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:18:02.960 "assigned_rate_limits": { 00:18:02.960 "rw_ios_per_sec": 0, 00:18:02.960 "rw_mbytes_per_sec": 0, 00:18:02.960 "r_mbytes_per_sec": 0, 00:18:02.960 "w_mbytes_per_sec": 0 00:18:02.960 }, 00:18:02.960 "claimed": true, 00:18:02.960 "claim_type": "exclusive_write", 00:18:02.960 "zoned": false, 00:18:02.960 "supported_io_types": { 00:18:02.960 "read": true, 00:18:02.960 "write": true, 00:18:02.960 "unmap": true, 00:18:02.960 "flush": true, 00:18:02.960 "reset": true, 00:18:02.960 "nvme_admin": false, 00:18:02.960 "nvme_io": false, 00:18:02.960 "nvme_io_md": false, 00:18:02.960 "write_zeroes": true, 00:18:02.960 "zcopy": true, 00:18:02.960 "get_zone_info": false, 00:18:02.960 "zone_management": false, 00:18:02.960 "zone_append": false, 00:18:02.960 "compare": false, 00:18:02.960 "compare_and_write": false, 00:18:02.960 "abort": true, 00:18:02.960 "seek_hole": false, 00:18:02.960 "seek_data": false, 00:18:02.960 "copy": true, 00:18:02.960 "nvme_iov_md": false 00:18:02.960 }, 00:18:02.960 "memory_domains": [ 00:18:02.960 { 00:18:02.960 "dma_device_id": "system", 00:18:02.960 "dma_device_type": 1 00:18:02.960 }, 00:18:02.960 { 00:18:02.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.960 "dma_device_type": 2 00:18:02.960 } 00:18:02.960 ], 00:18:02.960 "driver_specific": {} 00:18:02.960 } 00:18:02.960 ] 00:18:02.960 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:02.960 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:02.960 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:02.960 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:02.960 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
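The NewBaseBdev registration above follows the harness's waitforbdev pattern: after bdev_malloc_create returns, the script blocks on bdev_wait_for_examine and then queries bdev_get_bdevs for the new bdev with a timeout, so the state check that continues below only runs once the bdev is actually registered. A minimal sketch of that pattern, reusing the rpc.py path, socket and 2000 ms timeout visible in the trace (the helper name here is illustrative; the real waitforbdev in autotest_common.sh performs additional retries and checks):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

wait_for_new_bdev() {
    local name=$1 timeout_ms=${2:-2000}
    # Let all registered bdevs finish examination first.
    "$rpc" -s "$sock" bdev_wait_for_examine
    # Then fetch the named bdev, waiting up to timeout_ms for it to appear.
    "$rpc" -s "$sock" bdev_get_bdevs -b "$name" -t "$timeout_ms" > /dev/null
}

wait_for_new_bdev NewBaseBdev 2000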
00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.961 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.219 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.219 "name": "Existed_Raid", 00:18:03.219 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:18:03.219 "strip_size_kb": 0, 00:18:03.219 "state": "online", 00:18:03.219 "raid_level": "raid1", 00:18:03.219 "superblock": true, 00:18:03.219 "num_base_bdevs": 4, 00:18:03.219 "num_base_bdevs_discovered": 4, 00:18:03.219 "num_base_bdevs_operational": 4, 00:18:03.219 "base_bdevs_list": [ 00:18:03.219 { 00:18:03.219 "name": "NewBaseBdev", 00:18:03.219 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:18:03.219 "is_configured": true, 00:18:03.219 "data_offset": 2048, 00:18:03.219 "data_size": 63488 00:18:03.219 }, 00:18:03.219 { 00:18:03.219 "name": "BaseBdev2", 00:18:03.219 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:18:03.219 "is_configured": true, 00:18:03.219 "data_offset": 2048, 00:18:03.219 "data_size": 63488 00:18:03.219 }, 00:18:03.219 { 00:18:03.219 "name": "BaseBdev3", 00:18:03.219 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:18:03.219 "is_configured": true, 00:18:03.219 "data_offset": 2048, 00:18:03.219 "data_size": 63488 00:18:03.219 }, 00:18:03.219 { 00:18:03.219 "name": "BaseBdev4", 00:18:03.219 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:18:03.219 "is_configured": true, 00:18:03.219 "data_offset": 2048, 00:18:03.219 "data_size": 63488 00:18:03.219 } 00:18:03.219 ] 00:18:03.219 }' 00:18:03.219 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.219 13:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:03.785 [2024-07-15 13:40:51.309056] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:03.785 "name": "Existed_Raid", 00:18:03.785 "aliases": [ 00:18:03.785 
"87fddf7f-2966-4c10-9db7-7057cf6ff160" 00:18:03.785 ], 00:18:03.785 "product_name": "Raid Volume", 00:18:03.785 "block_size": 512, 00:18:03.785 "num_blocks": 63488, 00:18:03.785 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:18:03.785 "assigned_rate_limits": { 00:18:03.785 "rw_ios_per_sec": 0, 00:18:03.785 "rw_mbytes_per_sec": 0, 00:18:03.785 "r_mbytes_per_sec": 0, 00:18:03.785 "w_mbytes_per_sec": 0 00:18:03.785 }, 00:18:03.785 "claimed": false, 00:18:03.785 "zoned": false, 00:18:03.785 "supported_io_types": { 00:18:03.785 "read": true, 00:18:03.785 "write": true, 00:18:03.785 "unmap": false, 00:18:03.785 "flush": false, 00:18:03.785 "reset": true, 00:18:03.785 "nvme_admin": false, 00:18:03.785 "nvme_io": false, 00:18:03.785 "nvme_io_md": false, 00:18:03.785 "write_zeroes": true, 00:18:03.785 "zcopy": false, 00:18:03.785 "get_zone_info": false, 00:18:03.785 "zone_management": false, 00:18:03.785 "zone_append": false, 00:18:03.785 "compare": false, 00:18:03.785 "compare_and_write": false, 00:18:03.785 "abort": false, 00:18:03.785 "seek_hole": false, 00:18:03.785 "seek_data": false, 00:18:03.785 "copy": false, 00:18:03.785 "nvme_iov_md": false 00:18:03.785 }, 00:18:03.785 "memory_domains": [ 00:18:03.785 { 00:18:03.785 "dma_device_id": "system", 00:18:03.785 "dma_device_type": 1 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.785 "dma_device_type": 2 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "system", 00:18:03.785 "dma_device_type": 1 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.785 "dma_device_type": 2 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "system", 00:18:03.785 "dma_device_type": 1 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.785 "dma_device_type": 2 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "system", 00:18:03.785 "dma_device_type": 1 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.785 "dma_device_type": 2 00:18:03.785 } 00:18:03.785 ], 00:18:03.785 "driver_specific": { 00:18:03.785 "raid": { 00:18:03.785 "uuid": "87fddf7f-2966-4c10-9db7-7057cf6ff160", 00:18:03.785 "strip_size_kb": 0, 00:18:03.785 "state": "online", 00:18:03.785 "raid_level": "raid1", 00:18:03.785 "superblock": true, 00:18:03.785 "num_base_bdevs": 4, 00:18:03.785 "num_base_bdevs_discovered": 4, 00:18:03.785 "num_base_bdevs_operational": 4, 00:18:03.785 "base_bdevs_list": [ 00:18:03.785 { 00:18:03.785 "name": "NewBaseBdev", 00:18:03.785 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:18:03.785 "is_configured": true, 00:18:03.785 "data_offset": 2048, 00:18:03.785 "data_size": 63488 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "name": "BaseBdev2", 00:18:03.785 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:18:03.785 "is_configured": true, 00:18:03.785 "data_offset": 2048, 00:18:03.785 "data_size": 63488 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "name": "BaseBdev3", 00:18:03.785 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:18:03.785 "is_configured": true, 00:18:03.785 "data_offset": 2048, 00:18:03.785 "data_size": 63488 00:18:03.785 }, 00:18:03.785 { 00:18:03.785 "name": "BaseBdev4", 00:18:03.785 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:18:03.785 "is_configured": true, 00:18:03.785 "data_offset": 2048, 00:18:03.785 "data_size": 63488 00:18:03.785 } 00:18:03.785 ] 00:18:03.785 } 00:18:03.785 } 00:18:03.785 }' 00:18:03.785 13:40:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:03.785 BaseBdev2 00:18:03.785 BaseBdev3 00:18:03.785 BaseBdev4' 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:03.785 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.786 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:04.043 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:04.043 "name": "NewBaseBdev", 00:18:04.043 "aliases": [ 00:18:04.043 "ed4f0d2d-5b7a-445a-8d78-316bd45085fd" 00:18:04.043 ], 00:18:04.043 "product_name": "Malloc disk", 00:18:04.043 "block_size": 512, 00:18:04.043 "num_blocks": 65536, 00:18:04.043 "uuid": "ed4f0d2d-5b7a-445a-8d78-316bd45085fd", 00:18:04.043 "assigned_rate_limits": { 00:18:04.043 "rw_ios_per_sec": 0, 00:18:04.043 "rw_mbytes_per_sec": 0, 00:18:04.043 "r_mbytes_per_sec": 0, 00:18:04.043 "w_mbytes_per_sec": 0 00:18:04.043 }, 00:18:04.043 "claimed": true, 00:18:04.043 "claim_type": "exclusive_write", 00:18:04.043 "zoned": false, 00:18:04.043 "supported_io_types": { 00:18:04.043 "read": true, 00:18:04.043 "write": true, 00:18:04.043 "unmap": true, 00:18:04.043 "flush": true, 00:18:04.043 "reset": true, 00:18:04.043 "nvme_admin": false, 00:18:04.043 "nvme_io": false, 00:18:04.043 "nvme_io_md": false, 00:18:04.043 "write_zeroes": true, 00:18:04.043 "zcopy": true, 00:18:04.043 "get_zone_info": false, 00:18:04.043 "zone_management": false, 00:18:04.043 "zone_append": false, 00:18:04.043 "compare": false, 00:18:04.043 "compare_and_write": false, 00:18:04.043 "abort": true, 00:18:04.043 "seek_hole": false, 00:18:04.043 "seek_data": false, 00:18:04.043 "copy": true, 00:18:04.043 "nvme_iov_md": false 00:18:04.043 }, 00:18:04.043 "memory_domains": [ 00:18:04.043 { 00:18:04.043 "dma_device_id": "system", 00:18:04.043 "dma_device_type": 1 00:18:04.043 }, 00:18:04.043 { 00:18:04.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.043 "dma_device_type": 2 00:18:04.043 } 00:18:04.043 ], 00:18:04.043 "driver_specific": {} 00:18:04.043 }' 00:18:04.043 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.043 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.043 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:04.043 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:04.301 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:04.559 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:04.559 "name": "BaseBdev2", 00:18:04.559 "aliases": [ 00:18:04.559 "5981284b-007b-4cbe-971d-71f46af5a5b6" 00:18:04.559 ], 00:18:04.559 "product_name": "Malloc disk", 00:18:04.559 "block_size": 512, 00:18:04.559 "num_blocks": 65536, 00:18:04.559 "uuid": "5981284b-007b-4cbe-971d-71f46af5a5b6", 00:18:04.559 "assigned_rate_limits": { 00:18:04.559 "rw_ios_per_sec": 0, 00:18:04.559 "rw_mbytes_per_sec": 0, 00:18:04.559 "r_mbytes_per_sec": 0, 00:18:04.559 "w_mbytes_per_sec": 0 00:18:04.559 }, 00:18:04.559 "claimed": true, 00:18:04.559 "claim_type": "exclusive_write", 00:18:04.559 "zoned": false, 00:18:04.559 "supported_io_types": { 00:18:04.559 "read": true, 00:18:04.559 "write": true, 00:18:04.559 "unmap": true, 00:18:04.559 "flush": true, 00:18:04.559 "reset": true, 00:18:04.559 "nvme_admin": false, 00:18:04.559 "nvme_io": false, 00:18:04.559 "nvme_io_md": false, 00:18:04.559 "write_zeroes": true, 00:18:04.559 "zcopy": true, 00:18:04.559 "get_zone_info": false, 00:18:04.559 "zone_management": false, 00:18:04.559 "zone_append": false, 00:18:04.559 "compare": false, 00:18:04.559 "compare_and_write": false, 00:18:04.559 "abort": true, 00:18:04.559 "seek_hole": false, 00:18:04.559 "seek_data": false, 00:18:04.559 "copy": true, 00:18:04.559 "nvme_iov_md": false 00:18:04.559 }, 00:18:04.559 "memory_domains": [ 00:18:04.559 { 00:18:04.559 "dma_device_id": "system", 00:18:04.559 "dma_device_type": 1 00:18:04.559 }, 00:18:04.559 { 00:18:04.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.559 "dma_device_type": 2 00:18:04.559 } 00:18:04.559 ], 00:18:04.559 "driver_specific": {} 00:18:04.559 }' 00:18:04.559 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.559 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.559 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:04.559 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.559 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.818 
13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:04.818 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.077 "name": "BaseBdev3", 00:18:05.077 "aliases": [ 00:18:05.077 "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e" 00:18:05.077 ], 00:18:05.077 "product_name": "Malloc disk", 00:18:05.077 "block_size": 512, 00:18:05.077 "num_blocks": 65536, 00:18:05.077 "uuid": "e3c52f7f-4db2-4a2b-8721-4c9da4e4db9e", 00:18:05.077 "assigned_rate_limits": { 00:18:05.077 "rw_ios_per_sec": 0, 00:18:05.077 "rw_mbytes_per_sec": 0, 00:18:05.077 "r_mbytes_per_sec": 0, 00:18:05.077 "w_mbytes_per_sec": 0 00:18:05.077 }, 00:18:05.077 "claimed": true, 00:18:05.077 "claim_type": "exclusive_write", 00:18:05.077 "zoned": false, 00:18:05.077 "supported_io_types": { 00:18:05.077 "read": true, 00:18:05.077 "write": true, 00:18:05.077 "unmap": true, 00:18:05.077 "flush": true, 00:18:05.077 "reset": true, 00:18:05.077 "nvme_admin": false, 00:18:05.077 "nvme_io": false, 00:18:05.077 "nvme_io_md": false, 00:18:05.077 "write_zeroes": true, 00:18:05.077 "zcopy": true, 00:18:05.077 "get_zone_info": false, 00:18:05.077 "zone_management": false, 00:18:05.077 "zone_append": false, 00:18:05.077 "compare": false, 00:18:05.077 "compare_and_write": false, 00:18:05.077 "abort": true, 00:18:05.077 "seek_hole": false, 00:18:05.077 "seek_data": false, 00:18:05.077 "copy": true, 00:18:05.077 "nvme_iov_md": false 00:18:05.077 }, 00:18:05.077 "memory_domains": [ 00:18:05.077 { 00:18:05.077 "dma_device_id": "system", 00:18:05.077 "dma_device_type": 1 00:18:05.077 }, 00:18:05.077 { 00:18:05.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.077 "dma_device_type": 2 00:18:05.077 } 00:18:05.077 ], 00:18:05.077 "driver_specific": {} 00:18:05.077 }' 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.077 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.335 13:40:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:05.335 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.594 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.594 "name": "BaseBdev4", 00:18:05.594 "aliases": [ 00:18:05.594 "3478ca77-7aa1-4669-91eb-d46c649f20e7" 00:18:05.594 ], 00:18:05.594 "product_name": "Malloc disk", 00:18:05.594 "block_size": 512, 00:18:05.594 "num_blocks": 65536, 00:18:05.594 "uuid": "3478ca77-7aa1-4669-91eb-d46c649f20e7", 00:18:05.594 "assigned_rate_limits": { 00:18:05.594 "rw_ios_per_sec": 0, 00:18:05.594 "rw_mbytes_per_sec": 0, 00:18:05.594 "r_mbytes_per_sec": 0, 00:18:05.594 "w_mbytes_per_sec": 0 00:18:05.594 }, 00:18:05.594 "claimed": true, 00:18:05.594 "claim_type": "exclusive_write", 00:18:05.594 "zoned": false, 00:18:05.594 "supported_io_types": { 00:18:05.594 "read": true, 00:18:05.594 "write": true, 00:18:05.595 "unmap": true, 00:18:05.595 "flush": true, 00:18:05.595 "reset": true, 00:18:05.595 "nvme_admin": false, 00:18:05.595 "nvme_io": false, 00:18:05.595 "nvme_io_md": false, 00:18:05.595 "write_zeroes": true, 00:18:05.595 "zcopy": true, 00:18:05.595 "get_zone_info": false, 00:18:05.595 "zone_management": false, 00:18:05.595 "zone_append": false, 00:18:05.595 "compare": false, 00:18:05.595 "compare_and_write": false, 00:18:05.595 "abort": true, 00:18:05.595 "seek_hole": false, 00:18:05.595 "seek_data": false, 00:18:05.595 "copy": true, 00:18:05.595 "nvme_iov_md": false 00:18:05.595 }, 00:18:05.595 "memory_domains": [ 00:18:05.595 { 00:18:05.595 "dma_device_id": "system", 00:18:05.595 "dma_device_type": 1 00:18:05.595 }, 00:18:05.595 { 00:18:05.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.595 "dma_device_type": 2 00:18:05.595 } 00:18:05.595 ], 00:18:05.595 "driver_specific": {} 00:18:05.595 }' 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:05.595 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.853 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.853 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.853 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.853 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.853 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.853 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:06.112 [2024-07-15 13:40:53.518582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:06.112 [2024-07-15 13:40:53.518610] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:06.112 [2024-07-15 13:40:53.518654] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:06.112 [2024-07-15 13:40:53.518849] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:06.112 [2024-07-15 13:40:53.518858] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d5900 name Existed_Raid, state offline 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 49077 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 49077 ']' 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 49077 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 49077 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 49077' 00:18:06.112 killing process with pid 49077 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 49077 00:18:06.112 [2024-07-15 13:40:53.583180] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:06.112 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 49077 00:18:06.112 [2024-07-15 13:40:53.623045] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:06.373 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:06.373 00:18:06.373 real 0m25.057s 00:18:06.373 user 0m45.693s 00:18:06.373 sys 0m4.880s 00:18:06.373 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:06.373 13:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.373 ************************************ 00:18:06.373 END TEST raid_state_function_test_sb 00:18:06.373 ************************************ 00:18:06.373 13:40:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:06.373 13:40:53 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:18:06.373 13:40:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:06.373 13:40:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:06.373 13:40:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:06.373 ************************************ 00:18:06.373 START TEST raid_superblock_test 00:18:06.373 ************************************ 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 
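raid_superblock_test drives the same RPC surface from a fresh bdev_svc instance. The records that follow show the app starting on a private RPC socket with bdev_raid debug logging enabled and the script waiting for it to listen. A rough sketch of that startup, with the poll loop standing in for the harness's waitforlisten helper (which additionally checks that the pid stays alive and bounds the number of retries):

app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Start the bdev service with its own RPC socket and raid debug logging.
"$app" -r "$sock" -L bdev_raid &
raid_pid=$!

# Poll until the socket answers RPCs before issuing any bdev_* commands.
until "$rpc" -s "$sock" rpc_get_methods > /dev/null 2>&1; do
    sleep 0.1
done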
00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=53025 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 53025 /var/tmp/spdk-raid.sock 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 53025 ']' 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:06.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:06.373 13:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.373 [2024-07-15 13:40:53.973343] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
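Once the target is listening, the test assembles its RAID1 volume from passthru bdevs layered on malloc bdevs, writing a superblock onto each base bdev; the trace further below records every one of these calls. Condensed into a sketch, with the socket, sizes and UUIDs exactly as they appear in the trace:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

for i in 1 2 3 4; do
    # 32 MiB malloc bdev with 512-byte blocks (65536 blocks), wrapped by a
    # passthru bdev carrying a fixed UUID so the superblock contents are stable.
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "malloc$i"
    "$rpc" -s "$sock" bdev_passthru_create -b "malloc$i" -p "pt$i" \
        -u "00000000-0000-0000-0000-00000000000$i"
done

# -s enables the on-bdev superblock; RAID1 takes no strip size.
"$rpc" -s "$sock" bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s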
00:18:06.373 [2024-07-15 13:40:53.973400] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid53025 ] 00:18:06.697 [2024-07-15 13:40:54.061533] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:06.697 [2024-07-15 13:40:54.148130] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:06.697 [2024-07-15 13:40:54.202112] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:06.697 [2024-07-15 13:40:54.202140] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:07.266 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:07.526 malloc1 00:18:07.526 13:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:07.526 [2024-07-15 13:40:55.098203] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:07.526 [2024-07-15 13:40:55.098242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:07.526 [2024-07-15 13:40:55.098255] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2039260 00:18:07.526 [2024-07-15 13:40:55.098263] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:07.526 [2024-07-15 13:40:55.099326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:07.526 [2024-07-15 13:40:55.099349] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:07.526 pt1 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:07.526 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:07.785 malloc2 00:18:07.785 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:08.044 [2024-07-15 13:40:55.442966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:08.044 [2024-07-15 13:40:55.443004] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.044 [2024-07-15 13:40:55.443017] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e3310 00:18:08.044 [2024-07-15 13:40:55.443026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.044 [2024-07-15 13:40:55.443943] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.044 [2024-07-15 13:40:55.443967] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:08.044 pt2 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:08.044 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:08.045 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:08.045 malloc3 00:18:08.045 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:08.303 [2024-07-15 13:40:55.807650] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:08.303 [2024-07-15 13:40:55.807684] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.303 [2024-07-15 13:40:55.807697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e6e70 00:18:08.303 [2024-07-15 13:40:55.807705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.303 [2024-07-15 13:40:55.808638] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.303 [2024-07-15 13:40:55.808661] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:08.303 pt3 00:18:08.303 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:08.303 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:08.304 13:40:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:08.563 malloc4 00:18:08.563 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:08.563 [2024-07-15 13:40:56.172315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:08.563 [2024-07-15 13:40:56.172352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.563 [2024-07-15 13:40:56.172364] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e3d40 00:18:08.563 [2024-07-15 13:40:56.172372] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.563 [2024-07-15 13:40:56.173351] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.563 [2024-07-15 13:40:56.173374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:08.563 pt4 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:08.822 [2024-07-15 13:40:56.348782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:08.822 [2024-07-15 13:40:56.349590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:08.822 [2024-07-15 13:40:56.349630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:08.822 [2024-07-15 13:40:56.349659] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:08.822 [2024-07-15 13:40:56.349774] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21e7180 00:18:08.822 [2024-07-15 13:40:56.349782] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:08.822 [2024-07-15 13:40:56.349909] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x21eb580 00:18:08.822 [2024-07-15 13:40:56.350018] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21e7180 00:18:08.822 [2024-07-15 13:40:56.350026] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21e7180 00:18:08.822 [2024-07-15 13:40:56.350087] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.822 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:09.082 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.082 "name": "raid_bdev1", 00:18:09.082 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:09.082 "strip_size_kb": 0, 00:18:09.082 "state": "online", 00:18:09.082 "raid_level": "raid1", 00:18:09.082 "superblock": true, 00:18:09.082 "num_base_bdevs": 4, 00:18:09.082 "num_base_bdevs_discovered": 4, 00:18:09.082 "num_base_bdevs_operational": 4, 00:18:09.082 "base_bdevs_list": [ 00:18:09.082 { 00:18:09.082 "name": "pt1", 00:18:09.082 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:09.082 "is_configured": true, 00:18:09.082 "data_offset": 2048, 00:18:09.082 "data_size": 63488 00:18:09.082 }, 00:18:09.082 { 00:18:09.082 "name": "pt2", 00:18:09.082 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:09.082 "is_configured": true, 00:18:09.082 "data_offset": 2048, 00:18:09.082 "data_size": 63488 00:18:09.082 }, 00:18:09.082 { 00:18:09.082 "name": "pt3", 00:18:09.082 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:09.082 "is_configured": true, 00:18:09.082 "data_offset": 2048, 00:18:09.082 "data_size": 63488 00:18:09.082 }, 00:18:09.082 { 00:18:09.082 "name": "pt4", 00:18:09.082 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:09.082 "is_configured": true, 00:18:09.082 "data_offset": 2048, 00:18:09.082 "data_size": 63488 00:18:09.082 } 00:18:09.082 ] 00:18:09.082 }' 00:18:09.082 13:40:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.082 13:40:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:09.650 [2024-07-15 13:40:57.183128] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:09.650 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:09.650 "name": "raid_bdev1", 00:18:09.650 "aliases": [ 00:18:09.650 "09777c47-dfe1-4520-a042-c98d153d2bc9" 00:18:09.650 ], 00:18:09.650 "product_name": "Raid Volume", 00:18:09.650 "block_size": 512, 00:18:09.650 "num_blocks": 63488, 00:18:09.650 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:09.650 "assigned_rate_limits": { 00:18:09.650 "rw_ios_per_sec": 0, 00:18:09.650 "rw_mbytes_per_sec": 0, 00:18:09.650 "r_mbytes_per_sec": 0, 00:18:09.650 "w_mbytes_per_sec": 0 00:18:09.650 }, 00:18:09.650 "claimed": false, 00:18:09.650 "zoned": false, 00:18:09.650 "supported_io_types": { 00:18:09.650 "read": true, 00:18:09.650 "write": true, 00:18:09.650 "unmap": false, 00:18:09.650 "flush": false, 00:18:09.650 "reset": true, 00:18:09.650 "nvme_admin": false, 00:18:09.650 "nvme_io": false, 00:18:09.650 "nvme_io_md": false, 00:18:09.650 "write_zeroes": true, 00:18:09.650 "zcopy": false, 00:18:09.650 "get_zone_info": false, 00:18:09.650 "zone_management": false, 00:18:09.650 "zone_append": false, 00:18:09.650 "compare": false, 00:18:09.650 "compare_and_write": false, 00:18:09.650 "abort": false, 00:18:09.650 "seek_hole": false, 00:18:09.650 "seek_data": false, 00:18:09.650 "copy": false, 00:18:09.650 "nvme_iov_md": false 00:18:09.650 }, 00:18:09.650 "memory_domains": [ 00:18:09.650 { 00:18:09.650 "dma_device_id": "system", 00:18:09.651 "dma_device_type": 1 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.651 "dma_device_type": 2 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "system", 00:18:09.651 "dma_device_type": 1 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.651 "dma_device_type": 2 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "system", 00:18:09.651 "dma_device_type": 1 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.651 "dma_device_type": 2 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "system", 00:18:09.651 "dma_device_type": 1 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.651 "dma_device_type": 2 00:18:09.651 } 00:18:09.651 ], 00:18:09.651 "driver_specific": { 00:18:09.651 "raid": { 00:18:09.651 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:09.651 "strip_size_kb": 0, 00:18:09.651 "state": "online", 00:18:09.651 "raid_level": "raid1", 00:18:09.651 "superblock": true, 00:18:09.651 
"num_base_bdevs": 4, 00:18:09.651 "num_base_bdevs_discovered": 4, 00:18:09.651 "num_base_bdevs_operational": 4, 00:18:09.651 "base_bdevs_list": [ 00:18:09.651 { 00:18:09.651 "name": "pt1", 00:18:09.651 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:09.651 "is_configured": true, 00:18:09.651 "data_offset": 2048, 00:18:09.651 "data_size": 63488 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "name": "pt2", 00:18:09.651 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:09.651 "is_configured": true, 00:18:09.651 "data_offset": 2048, 00:18:09.651 "data_size": 63488 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "name": "pt3", 00:18:09.651 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:09.651 "is_configured": true, 00:18:09.651 "data_offset": 2048, 00:18:09.651 "data_size": 63488 00:18:09.651 }, 00:18:09.651 { 00:18:09.651 "name": "pt4", 00:18:09.651 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:09.651 "is_configured": true, 00:18:09.651 "data_offset": 2048, 00:18:09.651 "data_size": 63488 00:18:09.651 } 00:18:09.651 ] 00:18:09.651 } 00:18:09.651 } 00:18:09.651 }' 00:18:09.651 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:09.651 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:09.651 pt2 00:18:09.651 pt3 00:18:09.651 pt4' 00:18:09.651 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:09.651 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:09.651 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:09.909 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:09.909 "name": "pt1", 00:18:09.909 "aliases": [ 00:18:09.909 "00000000-0000-0000-0000-000000000001" 00:18:09.909 ], 00:18:09.909 "product_name": "passthru", 00:18:09.909 "block_size": 512, 00:18:09.909 "num_blocks": 65536, 00:18:09.909 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:09.909 "assigned_rate_limits": { 00:18:09.909 "rw_ios_per_sec": 0, 00:18:09.909 "rw_mbytes_per_sec": 0, 00:18:09.909 "r_mbytes_per_sec": 0, 00:18:09.909 "w_mbytes_per_sec": 0 00:18:09.909 }, 00:18:09.909 "claimed": true, 00:18:09.909 "claim_type": "exclusive_write", 00:18:09.909 "zoned": false, 00:18:09.909 "supported_io_types": { 00:18:09.909 "read": true, 00:18:09.909 "write": true, 00:18:09.909 "unmap": true, 00:18:09.909 "flush": true, 00:18:09.909 "reset": true, 00:18:09.909 "nvme_admin": false, 00:18:09.909 "nvme_io": false, 00:18:09.909 "nvme_io_md": false, 00:18:09.909 "write_zeroes": true, 00:18:09.909 "zcopy": true, 00:18:09.909 "get_zone_info": false, 00:18:09.909 "zone_management": false, 00:18:09.909 "zone_append": false, 00:18:09.909 "compare": false, 00:18:09.909 "compare_and_write": false, 00:18:09.909 "abort": true, 00:18:09.909 "seek_hole": false, 00:18:09.909 "seek_data": false, 00:18:09.909 "copy": true, 00:18:09.909 "nvme_iov_md": false 00:18:09.909 }, 00:18:09.909 "memory_domains": [ 00:18:09.909 { 00:18:09.909 "dma_device_id": "system", 00:18:09.909 "dma_device_type": 1 00:18:09.909 }, 00:18:09.909 { 00:18:09.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.909 "dma_device_type": 2 00:18:09.909 } 00:18:09.909 ], 00:18:09.909 "driver_specific": { 00:18:09.909 "passthru": { 00:18:09.909 
"name": "pt1", 00:18:09.909 "base_bdev_name": "malloc1" 00:18:09.909 } 00:18:09.909 } 00:18:09.909 }' 00:18:09.909 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.909 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.909 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:09.909 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.169 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:10.428 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.428 "name": "pt2", 00:18:10.428 "aliases": [ 00:18:10.428 "00000000-0000-0000-0000-000000000002" 00:18:10.428 ], 00:18:10.428 "product_name": "passthru", 00:18:10.428 "block_size": 512, 00:18:10.428 "num_blocks": 65536, 00:18:10.428 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:10.428 "assigned_rate_limits": { 00:18:10.428 "rw_ios_per_sec": 0, 00:18:10.428 "rw_mbytes_per_sec": 0, 00:18:10.428 "r_mbytes_per_sec": 0, 00:18:10.428 "w_mbytes_per_sec": 0 00:18:10.428 }, 00:18:10.428 "claimed": true, 00:18:10.428 "claim_type": "exclusive_write", 00:18:10.428 "zoned": false, 00:18:10.428 "supported_io_types": { 00:18:10.428 "read": true, 00:18:10.428 "write": true, 00:18:10.428 "unmap": true, 00:18:10.428 "flush": true, 00:18:10.428 "reset": true, 00:18:10.428 "nvme_admin": false, 00:18:10.428 "nvme_io": false, 00:18:10.428 "nvme_io_md": false, 00:18:10.428 "write_zeroes": true, 00:18:10.428 "zcopy": true, 00:18:10.428 "get_zone_info": false, 00:18:10.428 "zone_management": false, 00:18:10.428 "zone_append": false, 00:18:10.428 "compare": false, 00:18:10.428 "compare_and_write": false, 00:18:10.428 "abort": true, 00:18:10.428 "seek_hole": false, 00:18:10.429 "seek_data": false, 00:18:10.429 "copy": true, 00:18:10.429 "nvme_iov_md": false 00:18:10.429 }, 00:18:10.429 "memory_domains": [ 00:18:10.429 { 00:18:10.429 "dma_device_id": "system", 00:18:10.429 "dma_device_type": 1 00:18:10.429 }, 00:18:10.429 { 00:18:10.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.429 "dma_device_type": 2 00:18:10.429 } 00:18:10.429 ], 00:18:10.429 "driver_specific": { 00:18:10.429 "passthru": { 00:18:10.429 "name": "pt2", 00:18:10.429 "base_bdev_name": "malloc2" 00:18:10.429 } 00:18:10.429 } 00:18:10.429 }' 00:18:10.429 13:40:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.429 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.429 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.429 13:40:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.429 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:10.687 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.947 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.947 "name": "pt3", 00:18:10.947 "aliases": [ 00:18:10.947 "00000000-0000-0000-0000-000000000003" 00:18:10.947 ], 00:18:10.947 "product_name": "passthru", 00:18:10.947 "block_size": 512, 00:18:10.947 "num_blocks": 65536, 00:18:10.947 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:10.947 "assigned_rate_limits": { 00:18:10.947 "rw_ios_per_sec": 0, 00:18:10.947 "rw_mbytes_per_sec": 0, 00:18:10.947 "r_mbytes_per_sec": 0, 00:18:10.947 "w_mbytes_per_sec": 0 00:18:10.947 }, 00:18:10.947 "claimed": true, 00:18:10.947 "claim_type": "exclusive_write", 00:18:10.947 "zoned": false, 00:18:10.947 "supported_io_types": { 00:18:10.947 "read": true, 00:18:10.947 "write": true, 00:18:10.947 "unmap": true, 00:18:10.947 "flush": true, 00:18:10.947 "reset": true, 00:18:10.947 "nvme_admin": false, 00:18:10.947 "nvme_io": false, 00:18:10.947 "nvme_io_md": false, 00:18:10.947 "write_zeroes": true, 00:18:10.947 "zcopy": true, 00:18:10.947 "get_zone_info": false, 00:18:10.947 "zone_management": false, 00:18:10.947 "zone_append": false, 00:18:10.947 "compare": false, 00:18:10.947 "compare_and_write": false, 00:18:10.947 "abort": true, 00:18:10.947 "seek_hole": false, 00:18:10.947 "seek_data": false, 00:18:10.947 "copy": true, 00:18:10.947 "nvme_iov_md": false 00:18:10.947 }, 00:18:10.947 "memory_domains": [ 00:18:10.947 { 00:18:10.947 "dma_device_id": "system", 00:18:10.947 "dma_device_type": 1 00:18:10.947 }, 00:18:10.947 { 00:18:10.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.947 "dma_device_type": 2 00:18:10.947 } 00:18:10.947 ], 00:18:10.947 "driver_specific": { 00:18:10.947 "passthru": { 00:18:10.947 "name": "pt3", 00:18:10.947 "base_bdev_name": "malloc3" 00:18:10.947 } 00:18:10.947 } 00:18:10.947 }' 00:18:10.947 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.947 13:40:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.947 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.947 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.947 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:11.206 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:11.465 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:11.465 "name": "pt4", 00:18:11.465 "aliases": [ 00:18:11.465 "00000000-0000-0000-0000-000000000004" 00:18:11.465 ], 00:18:11.465 "product_name": "passthru", 00:18:11.465 "block_size": 512, 00:18:11.465 "num_blocks": 65536, 00:18:11.465 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:11.465 "assigned_rate_limits": { 00:18:11.465 "rw_ios_per_sec": 0, 00:18:11.465 "rw_mbytes_per_sec": 0, 00:18:11.465 "r_mbytes_per_sec": 0, 00:18:11.465 "w_mbytes_per_sec": 0 00:18:11.465 }, 00:18:11.465 "claimed": true, 00:18:11.465 "claim_type": "exclusive_write", 00:18:11.465 "zoned": false, 00:18:11.465 "supported_io_types": { 00:18:11.465 "read": true, 00:18:11.465 "write": true, 00:18:11.465 "unmap": true, 00:18:11.465 "flush": true, 00:18:11.465 "reset": true, 00:18:11.465 "nvme_admin": false, 00:18:11.465 "nvme_io": false, 00:18:11.465 "nvme_io_md": false, 00:18:11.465 "write_zeroes": true, 00:18:11.465 "zcopy": true, 00:18:11.465 "get_zone_info": false, 00:18:11.465 "zone_management": false, 00:18:11.465 "zone_append": false, 00:18:11.465 "compare": false, 00:18:11.465 "compare_and_write": false, 00:18:11.465 "abort": true, 00:18:11.465 "seek_hole": false, 00:18:11.465 "seek_data": false, 00:18:11.466 "copy": true, 00:18:11.466 "nvme_iov_md": false 00:18:11.466 }, 00:18:11.466 "memory_domains": [ 00:18:11.466 { 00:18:11.466 "dma_device_id": "system", 00:18:11.466 "dma_device_type": 1 00:18:11.466 }, 00:18:11.466 { 00:18:11.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.466 "dma_device_type": 2 00:18:11.466 } 00:18:11.466 ], 00:18:11.466 "driver_specific": { 00:18:11.466 "passthru": { 00:18:11.466 "name": "pt4", 00:18:11.466 "base_bdev_name": "malloc4" 00:18:11.466 } 00:18:11.466 } 00:18:11.466 }' 00:18:11.466 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:11.466 13:40:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:11.466 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:18:11.466 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.466 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:11.725 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:11.984 [2024-07-15 13:40:59.384875] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:11.984 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=09777c47-dfe1-4520-a042-c98d153d2bc9 00:18:11.984 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 09777c47-dfe1-4520-a042-c98d153d2bc9 ']' 00:18:11.984 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:11.984 [2024-07-15 13:40:59.565124] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:11.984 [2024-07-15 13:40:59.565143] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:11.984 [2024-07-15 13:40:59.565181] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:11.984 [2024-07-15 13:40:59.565236] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:11.984 [2024-07-15 13:40:59.565244] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21e7180 name raid_bdev1, state offline 00:18:11.985 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.985 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:12.244 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:12.244 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:12.244 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:12.244 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:12.503 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:12.503 13:40:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:18:12.503 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:12.503 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:12.762 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:12.762 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:13.019 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:13.019 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:13.278 [2024-07-15 13:41:00.796409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:13.278 [2024-07-15 13:41:00.797466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:13.278 [2024-07-15 13:41:00.797518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:13.278 [2024-07-15 13:41:00.797542] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:13.278 [2024-07-15 13:41:00.797577] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:13.278 [2024-07-15 13:41:00.797609] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:13.278 [2024-07-15 13:41:00.797630] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:13.278 [2024-07-15 13:41:00.797644] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:13.278 [2024-07-15 13:41:00.797657] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:13.278 [2024-07-15 13:41:00.797665] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21e3540 name raid_bdev1, state configuring 00:18:13.278 request: 00:18:13.278 { 00:18:13.278 "name": "raid_bdev1", 00:18:13.278 "raid_level": "raid1", 00:18:13.278 "base_bdevs": [ 00:18:13.278 "malloc1", 00:18:13.278 "malloc2", 00:18:13.278 "malloc3", 00:18:13.278 "malloc4" 00:18:13.278 ], 00:18:13.278 "superblock": false, 00:18:13.278 "method": "bdev_raid_create", 00:18:13.278 "req_id": 1 00:18:13.278 } 00:18:13.278 Got JSON-RPC error response 00:18:13.278 response: 00:18:13.278 { 00:18:13.278 "code": -17, 00:18:13.278 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:13.278 } 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.278 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:13.537 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:13.537 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:13.537 13:41:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:13.796 [2024-07-15 13:41:01.157293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:13.796 [2024-07-15 13:41:01.157332] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.796 [2024-07-15 13:41:01.157349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e50a0 00:18:13.796 [2024-07-15 13:41:01.157358] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.796 [2024-07-15 13:41:01.158573] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.796 [2024-07-15 13:41:01.158600] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:13.796 [2024-07-15 13:41:01.158654] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:18:13.796 [2024-07-15 13:41:01.158674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:13.796 pt1 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.796 "name": "raid_bdev1", 00:18:13.796 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:13.796 "strip_size_kb": 0, 00:18:13.796 "state": "configuring", 00:18:13.796 "raid_level": "raid1", 00:18:13.796 "superblock": true, 00:18:13.796 "num_base_bdevs": 4, 00:18:13.796 "num_base_bdevs_discovered": 1, 00:18:13.796 "num_base_bdevs_operational": 4, 00:18:13.796 "base_bdevs_list": [ 00:18:13.796 { 00:18:13.796 "name": "pt1", 00:18:13.796 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:13.796 "is_configured": true, 00:18:13.796 "data_offset": 2048, 00:18:13.796 "data_size": 63488 00:18:13.796 }, 00:18:13.796 { 00:18:13.796 "name": null, 00:18:13.796 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:13.796 "is_configured": false, 00:18:13.796 "data_offset": 2048, 00:18:13.796 "data_size": 63488 00:18:13.796 }, 00:18:13.796 { 00:18:13.796 "name": null, 00:18:13.796 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:13.796 "is_configured": false, 00:18:13.796 "data_offset": 2048, 00:18:13.796 "data_size": 63488 00:18:13.796 }, 00:18:13.796 { 00:18:13.796 "name": null, 00:18:13.796 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:13.796 "is_configured": false, 00:18:13.796 "data_offset": 2048, 00:18:13.796 "data_size": 63488 00:18:13.796 } 00:18:13.796 ] 00:18:13.796 }' 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.796 13:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.362 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:14.362 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:18:14.362 [2024-07-15 13:41:01.975426] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:14.362 [2024-07-15 13:41:01.975469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.362 [2024-07-15 13:41:01.975482] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2037ee0 00:18:14.362 [2024-07-15 13:41:01.975490] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.362 [2024-07-15 13:41:01.975745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.362 [2024-07-15 13:41:01.975774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:14.362 [2024-07-15 13:41:01.975825] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:14.362 [2024-07-15 13:41:01.975840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:14.620 pt2 00:18:14.620 13:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:14.620 [2024-07-15 13:41:02.147872] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.620 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:14.878 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.878 "name": "raid_bdev1", 00:18:14.878 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:14.878 "strip_size_kb": 0, 00:18:14.878 "state": "configuring", 00:18:14.878 "raid_level": "raid1", 00:18:14.878 "superblock": true, 00:18:14.878 "num_base_bdevs": 4, 00:18:14.878 "num_base_bdevs_discovered": 1, 00:18:14.878 "num_base_bdevs_operational": 4, 00:18:14.878 "base_bdevs_list": [ 00:18:14.878 { 00:18:14.878 "name": "pt1", 00:18:14.878 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:14.878 "is_configured": true, 00:18:14.878 "data_offset": 2048, 00:18:14.878 "data_size": 63488 00:18:14.878 }, 00:18:14.878 { 00:18:14.878 "name": null, 00:18:14.878 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:18:14.878 "is_configured": false, 00:18:14.878 "data_offset": 2048, 00:18:14.878 "data_size": 63488 00:18:14.878 }, 00:18:14.878 { 00:18:14.878 "name": null, 00:18:14.878 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:14.878 "is_configured": false, 00:18:14.878 "data_offset": 2048, 00:18:14.878 "data_size": 63488 00:18:14.878 }, 00:18:14.878 { 00:18:14.878 "name": null, 00:18:14.878 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:14.878 "is_configured": false, 00:18:14.878 "data_offset": 2048, 00:18:14.878 "data_size": 63488 00:18:14.878 } 00:18:14.878 ] 00:18:14.878 }' 00:18:14.878 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.878 13:41:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.444 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:15.444 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:15.444 13:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:15.444 [2024-07-15 13:41:03.014111] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:15.444 [2024-07-15 13:41:03.014154] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:15.444 [2024-07-15 13:41:03.014168] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2038270 00:18:15.444 [2024-07-15 13:41:03.014176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:15.444 [2024-07-15 13:41:03.014433] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:15.444 [2024-07-15 13:41:03.014446] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:15.444 [2024-07-15 13:41:03.014494] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:15.444 [2024-07-15 13:41:03.014508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:15.444 pt2 00:18:15.444 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:15.444 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:15.444 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:15.704 [2024-07-15 13:41:03.190559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:15.704 [2024-07-15 13:41:03.190580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:15.704 [2024-07-15 13:41:03.190592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e7400 00:18:15.704 [2024-07-15 13:41:03.190600] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:15.704 [2024-07-15 13:41:03.190785] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:15.704 [2024-07-15 13:41:03.190797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:15.704 [2024-07-15 13:41:03.190829] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:18:15.704 [2024-07-15 13:41:03.190840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:15.704 pt3 00:18:15.704 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:15.704 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:15.704 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:15.961 [2024-07-15 13:41:03.354979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:15.961 [2024-07-15 13:41:03.355005] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:15.961 [2024-07-15 13:41:03.355016] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e5470 00:18:15.961 [2024-07-15 13:41:03.355024] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:15.961 [2024-07-15 13:41:03.355208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:15.961 [2024-07-15 13:41:03.355220] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:15.961 [2024-07-15 13:41:03.355252] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:15.961 [2024-07-15 13:41:03.355264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:15.961 [2024-07-15 13:41:03.355345] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21e6320 00:18:15.961 [2024-07-15 13:41:03.355353] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:15.961 [2024-07-15 13:41:03.355476] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21e95f0 00:18:15.961 [2024-07-15 13:41:03.355571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21e6320 00:18:15.961 [2024-07-15 13:41:03.355578] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21e6320 00:18:15.961 [2024-07-15 13:41:03.355643] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:15.961 pt4 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
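The trace at this point is verify_raid_bdev_state confirming that the reassembled raid_bdev1 came back online as raid1 with all four base bdevs. A minimal standalone sketch of that check, reusing only the rpc.py path, RPC socket, bdev name, and JSON fields that appear in this log (the expected values 'online', 'raid1' and 4 are specific to this run, not part of the helper itself):

#!/usr/bin/env bash
# Sketch of the verify_raid_bdev_state pattern exercised by this test.
# Assumes an SPDK target is listening on /var/tmp/spdk-raid.sock and that
# a raid bdev named raid_bdev1 has been created, as in the log above.
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock
# Dump all raid bdevs and keep only the one under test.
info=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# Compare the fields the test asserts on: state, level and base bdev counts.
state=$(jq -r '.state' <<< "$info")
level=$(jq -r '.raid_level' <<< "$info")
discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$info")
operational=$(jq -r '.num_base_bdevs_operational' <<< "$info")
[[ $state == online && $level == raid1 && $discovered -eq 4 && $operational -eq 4 ]]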
00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.961 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.962 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.962 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.962 "name": "raid_bdev1", 00:18:15.962 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:15.962 "strip_size_kb": 0, 00:18:15.962 "state": "online", 00:18:15.962 "raid_level": "raid1", 00:18:15.962 "superblock": true, 00:18:15.962 "num_base_bdevs": 4, 00:18:15.962 "num_base_bdevs_discovered": 4, 00:18:15.962 "num_base_bdevs_operational": 4, 00:18:15.962 "base_bdevs_list": [ 00:18:15.962 { 00:18:15.962 "name": "pt1", 00:18:15.962 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:15.962 "is_configured": true, 00:18:15.962 "data_offset": 2048, 00:18:15.962 "data_size": 63488 00:18:15.962 }, 00:18:15.962 { 00:18:15.962 "name": "pt2", 00:18:15.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:15.962 "is_configured": true, 00:18:15.962 "data_offset": 2048, 00:18:15.962 "data_size": 63488 00:18:15.962 }, 00:18:15.962 { 00:18:15.962 "name": "pt3", 00:18:15.962 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:15.962 "is_configured": true, 00:18:15.962 "data_offset": 2048, 00:18:15.962 "data_size": 63488 00:18:15.962 }, 00:18:15.962 { 00:18:15.962 "name": "pt4", 00:18:15.962 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:15.962 "is_configured": true, 00:18:15.962 "data_offset": 2048, 00:18:15.962 "data_size": 63488 00:18:15.962 } 00:18:15.962 ] 00:18:15.962 }' 00:18:15.962 13:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.962 13:41:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:16.528 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:16.788 [2024-07-15 13:41:04.209409] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:16.788 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:16.788 "name": "raid_bdev1", 00:18:16.788 "aliases": [ 00:18:16.788 "09777c47-dfe1-4520-a042-c98d153d2bc9" 00:18:16.788 ], 00:18:16.788 "product_name": "Raid Volume", 00:18:16.788 "block_size": 512, 00:18:16.788 "num_blocks": 63488, 00:18:16.788 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:16.788 "assigned_rate_limits": { 00:18:16.788 "rw_ios_per_sec": 0, 
00:18:16.788 "rw_mbytes_per_sec": 0, 00:18:16.788 "r_mbytes_per_sec": 0, 00:18:16.788 "w_mbytes_per_sec": 0 00:18:16.788 }, 00:18:16.788 "claimed": false, 00:18:16.788 "zoned": false, 00:18:16.788 "supported_io_types": { 00:18:16.788 "read": true, 00:18:16.788 "write": true, 00:18:16.788 "unmap": false, 00:18:16.788 "flush": false, 00:18:16.788 "reset": true, 00:18:16.788 "nvme_admin": false, 00:18:16.788 "nvme_io": false, 00:18:16.788 "nvme_io_md": false, 00:18:16.788 "write_zeroes": true, 00:18:16.788 "zcopy": false, 00:18:16.788 "get_zone_info": false, 00:18:16.788 "zone_management": false, 00:18:16.788 "zone_append": false, 00:18:16.788 "compare": false, 00:18:16.788 "compare_and_write": false, 00:18:16.788 "abort": false, 00:18:16.788 "seek_hole": false, 00:18:16.788 "seek_data": false, 00:18:16.788 "copy": false, 00:18:16.788 "nvme_iov_md": false 00:18:16.788 }, 00:18:16.788 "memory_domains": [ 00:18:16.788 { 00:18:16.788 "dma_device_id": "system", 00:18:16.788 "dma_device_type": 1 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.788 "dma_device_type": 2 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "system", 00:18:16.788 "dma_device_type": 1 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.788 "dma_device_type": 2 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "system", 00:18:16.788 "dma_device_type": 1 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.788 "dma_device_type": 2 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "system", 00:18:16.788 "dma_device_type": 1 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.788 "dma_device_type": 2 00:18:16.788 } 00:18:16.788 ], 00:18:16.788 "driver_specific": { 00:18:16.788 "raid": { 00:18:16.788 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:16.788 "strip_size_kb": 0, 00:18:16.788 "state": "online", 00:18:16.788 "raid_level": "raid1", 00:18:16.788 "superblock": true, 00:18:16.788 "num_base_bdevs": 4, 00:18:16.788 "num_base_bdevs_discovered": 4, 00:18:16.788 "num_base_bdevs_operational": 4, 00:18:16.788 "base_bdevs_list": [ 00:18:16.788 { 00:18:16.788 "name": "pt1", 00:18:16.788 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:16.788 "is_configured": true, 00:18:16.788 "data_offset": 2048, 00:18:16.788 "data_size": 63488 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "name": "pt2", 00:18:16.788 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:16.788 "is_configured": true, 00:18:16.788 "data_offset": 2048, 00:18:16.788 "data_size": 63488 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "name": "pt3", 00:18:16.788 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:16.788 "is_configured": true, 00:18:16.788 "data_offset": 2048, 00:18:16.788 "data_size": 63488 00:18:16.788 }, 00:18:16.788 { 00:18:16.788 "name": "pt4", 00:18:16.788 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:16.788 "is_configured": true, 00:18:16.788 "data_offset": 2048, 00:18:16.788 "data_size": 63488 00:18:16.788 } 00:18:16.788 ] 00:18:16.788 } 00:18:16.788 } 00:18:16.788 }' 00:18:16.788 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:16.788 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:16.788 pt2 00:18:16.788 pt3 00:18:16.788 pt4' 00:18:16.788 13:41:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.788 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.788 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:17.047 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.047 "name": "pt1", 00:18:17.047 "aliases": [ 00:18:17.047 "00000000-0000-0000-0000-000000000001" 00:18:17.047 ], 00:18:17.047 "product_name": "passthru", 00:18:17.047 "block_size": 512, 00:18:17.047 "num_blocks": 65536, 00:18:17.047 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:17.047 "assigned_rate_limits": { 00:18:17.047 "rw_ios_per_sec": 0, 00:18:17.047 "rw_mbytes_per_sec": 0, 00:18:17.047 "r_mbytes_per_sec": 0, 00:18:17.047 "w_mbytes_per_sec": 0 00:18:17.047 }, 00:18:17.047 "claimed": true, 00:18:17.047 "claim_type": "exclusive_write", 00:18:17.047 "zoned": false, 00:18:17.047 "supported_io_types": { 00:18:17.047 "read": true, 00:18:17.047 "write": true, 00:18:17.047 "unmap": true, 00:18:17.047 "flush": true, 00:18:17.047 "reset": true, 00:18:17.047 "nvme_admin": false, 00:18:17.047 "nvme_io": false, 00:18:17.047 "nvme_io_md": false, 00:18:17.048 "write_zeroes": true, 00:18:17.048 "zcopy": true, 00:18:17.048 "get_zone_info": false, 00:18:17.048 "zone_management": false, 00:18:17.048 "zone_append": false, 00:18:17.048 "compare": false, 00:18:17.048 "compare_and_write": false, 00:18:17.048 "abort": true, 00:18:17.048 "seek_hole": false, 00:18:17.048 "seek_data": false, 00:18:17.048 "copy": true, 00:18:17.048 "nvme_iov_md": false 00:18:17.048 }, 00:18:17.048 "memory_domains": [ 00:18:17.048 { 00:18:17.048 "dma_device_id": "system", 00:18:17.048 "dma_device_type": 1 00:18:17.048 }, 00:18:17.048 { 00:18:17.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.048 "dma_device_type": 2 00:18:17.048 } 00:18:17.048 ], 00:18:17.048 "driver_specific": { 00:18:17.048 "passthru": { 00:18:17.048 "name": "pt1", 00:18:17.048 "base_bdev_name": "malloc1" 00:18:17.048 } 00:18:17.048 } 00:18:17.048 }' 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.048 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.307 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.307 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.307 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.307 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.307 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.307 13:41:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:17.307 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.565 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.565 "name": "pt2", 00:18:17.565 "aliases": [ 00:18:17.565 "00000000-0000-0000-0000-000000000002" 00:18:17.565 ], 00:18:17.565 "product_name": "passthru", 00:18:17.565 "block_size": 512, 00:18:17.565 "num_blocks": 65536, 00:18:17.565 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:17.565 "assigned_rate_limits": { 00:18:17.565 "rw_ios_per_sec": 0, 00:18:17.565 "rw_mbytes_per_sec": 0, 00:18:17.565 "r_mbytes_per_sec": 0, 00:18:17.565 "w_mbytes_per_sec": 0 00:18:17.565 }, 00:18:17.565 "claimed": true, 00:18:17.565 "claim_type": "exclusive_write", 00:18:17.565 "zoned": false, 00:18:17.565 "supported_io_types": { 00:18:17.565 "read": true, 00:18:17.565 "write": true, 00:18:17.565 "unmap": true, 00:18:17.565 "flush": true, 00:18:17.565 "reset": true, 00:18:17.565 "nvme_admin": false, 00:18:17.565 "nvme_io": false, 00:18:17.565 "nvme_io_md": false, 00:18:17.565 "write_zeroes": true, 00:18:17.565 "zcopy": true, 00:18:17.565 "get_zone_info": false, 00:18:17.565 "zone_management": false, 00:18:17.565 "zone_append": false, 00:18:17.565 "compare": false, 00:18:17.565 "compare_and_write": false, 00:18:17.565 "abort": true, 00:18:17.565 "seek_hole": false, 00:18:17.565 "seek_data": false, 00:18:17.565 "copy": true, 00:18:17.565 "nvme_iov_md": false 00:18:17.565 }, 00:18:17.565 "memory_domains": [ 00:18:17.565 { 00:18:17.565 "dma_device_id": "system", 00:18:17.565 "dma_device_type": 1 00:18:17.565 }, 00:18:17.565 { 00:18:17.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.565 "dma_device_type": 2 00:18:17.565 } 00:18:17.565 ], 00:18:17.565 "driver_specific": { 00:18:17.565 "passthru": { 00:18:17.565 "name": "pt2", 00:18:17.565 "base_bdev_name": "malloc2" 00:18:17.565 } 00:18:17.565 } 00:18:17.565 }' 00:18:17.565 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.565 13:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.565 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.823 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.823 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.823 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.823 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:17.823 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.823 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.823 "name": "pt3", 00:18:17.823 "aliases": [ 00:18:17.823 "00000000-0000-0000-0000-000000000003" 00:18:17.823 ], 00:18:17.823 "product_name": "passthru", 00:18:17.823 "block_size": 512, 00:18:17.823 "num_blocks": 65536, 00:18:17.823 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:17.823 "assigned_rate_limits": { 00:18:17.823 "rw_ios_per_sec": 0, 00:18:17.823 "rw_mbytes_per_sec": 0, 00:18:17.823 "r_mbytes_per_sec": 0, 00:18:17.823 "w_mbytes_per_sec": 0 00:18:17.823 }, 00:18:17.823 "claimed": true, 00:18:17.823 "claim_type": "exclusive_write", 00:18:17.823 "zoned": false, 00:18:17.823 "supported_io_types": { 00:18:17.823 "read": true, 00:18:17.823 "write": true, 00:18:17.823 "unmap": true, 00:18:17.823 "flush": true, 00:18:17.823 "reset": true, 00:18:17.823 "nvme_admin": false, 00:18:17.824 "nvme_io": false, 00:18:17.824 "nvme_io_md": false, 00:18:17.824 "write_zeroes": true, 00:18:17.824 "zcopy": true, 00:18:17.824 "get_zone_info": false, 00:18:17.824 "zone_management": false, 00:18:17.824 "zone_append": false, 00:18:17.824 "compare": false, 00:18:17.824 "compare_and_write": false, 00:18:17.824 "abort": true, 00:18:17.824 "seek_hole": false, 00:18:17.824 "seek_data": false, 00:18:17.824 "copy": true, 00:18:17.824 "nvme_iov_md": false 00:18:17.824 }, 00:18:17.824 "memory_domains": [ 00:18:17.824 { 00:18:17.824 "dma_device_id": "system", 00:18:17.824 "dma_device_type": 1 00:18:17.824 }, 00:18:17.824 { 00:18:17.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.824 "dma_device_type": 2 00:18:17.824 } 00:18:17.824 ], 00:18:17.824 "driver_specific": { 00:18:17.824 "passthru": { 00:18:17.824 "name": "pt3", 00:18:17.824 "base_bdev_name": "malloc3" 00:18:17.824 } 00:18:17.824 } 00:18:17.824 }' 00:18:17.824 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.082 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.340 "name": "pt4", 00:18:18.340 "aliases": [ 00:18:18.340 "00000000-0000-0000-0000-000000000004" 00:18:18.340 ], 00:18:18.340 "product_name": "passthru", 00:18:18.340 "block_size": 512, 00:18:18.340 "num_blocks": 65536, 00:18:18.340 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:18.340 "assigned_rate_limits": { 00:18:18.340 "rw_ios_per_sec": 0, 00:18:18.340 "rw_mbytes_per_sec": 0, 00:18:18.340 "r_mbytes_per_sec": 0, 00:18:18.340 "w_mbytes_per_sec": 0 00:18:18.340 }, 00:18:18.340 "claimed": true, 00:18:18.340 "claim_type": "exclusive_write", 00:18:18.340 "zoned": false, 00:18:18.340 "supported_io_types": { 00:18:18.340 "read": true, 00:18:18.340 "write": true, 00:18:18.340 "unmap": true, 00:18:18.340 "flush": true, 00:18:18.340 "reset": true, 00:18:18.340 "nvme_admin": false, 00:18:18.340 "nvme_io": false, 00:18:18.340 "nvme_io_md": false, 00:18:18.340 "write_zeroes": true, 00:18:18.340 "zcopy": true, 00:18:18.340 "get_zone_info": false, 00:18:18.340 "zone_management": false, 00:18:18.340 "zone_append": false, 00:18:18.340 "compare": false, 00:18:18.340 "compare_and_write": false, 00:18:18.340 "abort": true, 00:18:18.340 "seek_hole": false, 00:18:18.340 "seek_data": false, 00:18:18.340 "copy": true, 00:18:18.340 "nvme_iov_md": false 00:18:18.340 }, 00:18:18.340 "memory_domains": [ 00:18:18.340 { 00:18:18.340 "dma_device_id": "system", 00:18:18.340 "dma_device_type": 1 00:18:18.340 }, 00:18:18.340 { 00:18:18.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.340 "dma_device_type": 2 00:18:18.340 } 00:18:18.340 ], 00:18:18.340 "driver_specific": { 00:18:18.340 "passthru": { 00:18:18.340 "name": "pt4", 00:18:18.340 "base_bdev_name": "malloc4" 00:18:18.340 } 00:18:18.340 } 00:18:18.340 }' 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.340 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.599 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.599 13:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:18.599 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:18.858 [2024-07-15 13:41:06.350935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:18.858 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
09777c47-dfe1-4520-a042-c98d153d2bc9 '!=' 09777c47-dfe1-4520-a042-c98d153d2bc9 ']' 00:18:18.858 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:18:18.858 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:18.858 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:18.859 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:19.118 [2024-07-15 13:41:06.531205] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.118 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:19.377 13:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:19.377 "name": "raid_bdev1", 00:18:19.377 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:19.377 "strip_size_kb": 0, 00:18:19.377 "state": "online", 00:18:19.377 "raid_level": "raid1", 00:18:19.377 "superblock": true, 00:18:19.377 "num_base_bdevs": 4, 00:18:19.377 "num_base_bdevs_discovered": 3, 00:18:19.377 "num_base_bdevs_operational": 3, 00:18:19.377 "base_bdevs_list": [ 00:18:19.377 { 00:18:19.377 "name": null, 00:18:19.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:19.377 "is_configured": false, 00:18:19.377 "data_offset": 2048, 00:18:19.377 "data_size": 63488 00:18:19.377 }, 00:18:19.377 { 00:18:19.377 "name": "pt2", 00:18:19.377 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:19.377 "is_configured": true, 00:18:19.377 "data_offset": 2048, 00:18:19.377 "data_size": 63488 00:18:19.377 }, 00:18:19.377 { 00:18:19.377 "name": "pt3", 00:18:19.377 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:19.377 "is_configured": true, 00:18:19.377 "data_offset": 2048, 00:18:19.377 "data_size": 63488 00:18:19.377 }, 00:18:19.377 { 00:18:19.377 "name": "pt4", 00:18:19.377 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:19.377 "is_configured": true, 00:18:19.377 "data_offset": 2048, 00:18:19.377 "data_size": 63488 00:18:19.377 } 00:18:19.377 ] 00:18:19.377 }' 00:18:19.377 13:41:06 
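The verify_raid_bdev_state helper called at @495, once pt1 has been removed, pulls the raid bdev description with bdev_raid_get_bdevs all, selects the raid_bdev1 entry with jq, and compares the reported state, RAID level and base-bdev counts against the expected values (online, raid1, three of the four members still discovered and operational). A rough sketch of those comparisons on the same socket (rpc.py path shortened); this only mirrors the jq filters visible in the trace, not the helper's complete logic:

  tmp=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [[ $(echo "$tmp" | jq -r .state) == online ]]
  [[ $(echo "$tmp" | jq -r .raid_level) == raid1 ]]
  [[ $(echo "$tmp" | jq -r .num_base_bdevs_discovered) == 3 ]]
  [[ $(echo "$tmp" | jq -r .num_base_bdevs_operational) == 3 ]]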
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:19.377 13:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.636 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:19.895 [2024-07-15 13:41:07.385398] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:19.895 [2024-07-15 13:41:07.385423] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:19.895 [2024-07-15 13:41:07.385462] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:19.895 [2024-07-15 13:41:07.385511] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:19.895 [2024-07-15 13:41:07.385519] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21e6320 name raid_bdev1, state offline 00:18:19.895 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.895 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:20.153 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:20.412 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:20.412 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:20.412 13:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:20.671 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:20.671 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:20.671 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:18:20.671 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:20.672 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:20.931 [2024-07-15 13:41:08.291699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:20.931 [2024-07-15 13:41:08.291737] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.931 [2024-07-15 13:41:08.291750] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e92f0 00:18:20.931 [2024-07-15 13:41:08.291758] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.931 [2024-07-15 13:41:08.292979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.931 [2024-07-15 13:41:08.293011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:20.931 [2024-07-15 13:41:08.293064] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:20.931 [2024-07-15 13:41:08.293085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:20.931 pt2 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.931 "name": "raid_bdev1", 00:18:20.931 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:20.931 "strip_size_kb": 0, 00:18:20.931 "state": "configuring", 00:18:20.931 "raid_level": "raid1", 00:18:20.931 "superblock": true, 00:18:20.931 "num_base_bdevs": 4, 00:18:20.931 "num_base_bdevs_discovered": 1, 00:18:20.931 "num_base_bdevs_operational": 3, 00:18:20.931 "base_bdevs_list": [ 00:18:20.931 { 00:18:20.931 "name": null, 00:18:20.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.931 "is_configured": false, 00:18:20.931 "data_offset": 2048, 00:18:20.931 "data_size": 63488 00:18:20.931 }, 00:18:20.931 { 00:18:20.931 "name": "pt2", 00:18:20.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:20.931 "is_configured": true, 00:18:20.931 "data_offset": 2048, 00:18:20.931 "data_size": 63488 00:18:20.931 }, 00:18:20.931 { 00:18:20.931 "name": null, 00:18:20.931 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:20.931 "is_configured": false, 00:18:20.931 "data_offset": 2048, 00:18:20.931 "data_size": 63488 00:18:20.931 }, 00:18:20.931 { 00:18:20.931 "name": null, 00:18:20.931 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:20.931 "is_configured": 
false, 00:18:20.931 "data_offset": 2048, 00:18:20.931 "data_size": 63488 00:18:20.931 } 00:18:20.931 ] 00:18:20.931 }' 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.931 13:41:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:21.499 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:21.499 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:21.499 13:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:21.759 [2024-07-15 13:41:09.118078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:21.759 [2024-07-15 13:41:09.118117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.759 [2024-07-15 13:41:09.118132] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2039e80 00:18:21.759 [2024-07-15 13:41:09.118140] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.759 [2024-07-15 13:41:09.118403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.759 [2024-07-15 13:41:09.118418] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:21.759 [2024-07-15 13:41:09.118464] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:21.759 [2024-07-15 13:41:09.118477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:21.759 pt3 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.759 "name": "raid_bdev1", 00:18:21.759 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:21.759 "strip_size_kb": 0, 00:18:21.759 "state": "configuring", 00:18:21.759 "raid_level": "raid1", 00:18:21.759 "superblock": true, 00:18:21.759 "num_base_bdevs": 
4, 00:18:21.759 "num_base_bdevs_discovered": 2, 00:18:21.759 "num_base_bdevs_operational": 3, 00:18:21.759 "base_bdevs_list": [ 00:18:21.759 { 00:18:21.759 "name": null, 00:18:21.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.759 "is_configured": false, 00:18:21.759 "data_offset": 2048, 00:18:21.759 "data_size": 63488 00:18:21.759 }, 00:18:21.759 { 00:18:21.759 "name": "pt2", 00:18:21.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:21.759 "is_configured": true, 00:18:21.759 "data_offset": 2048, 00:18:21.759 "data_size": 63488 00:18:21.759 }, 00:18:21.759 { 00:18:21.759 "name": "pt3", 00:18:21.759 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:21.759 "is_configured": true, 00:18:21.759 "data_offset": 2048, 00:18:21.759 "data_size": 63488 00:18:21.759 }, 00:18:21.759 { 00:18:21.759 "name": null, 00:18:21.759 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:21.759 "is_configured": false, 00:18:21.759 "data_offset": 2048, 00:18:21.759 "data_size": 63488 00:18:21.759 } 00:18:21.759 ] 00:18:21.759 }' 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.759 13:41:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.327 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:22.327 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:22.327 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:18:22.327 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:22.595 [2024-07-15 13:41:09.972286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:22.595 [2024-07-15 13:41:09.972329] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.595 [2024-07-15 13:41:09.972345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e8a40 00:18:22.595 [2024-07-15 13:41:09.972354] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.595 [2024-07-15 13:41:09.972630] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.595 [2024-07-15 13:41:09.972643] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:22.595 [2024-07-15 13:41:09.972692] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:22.595 [2024-07-15 13:41:09.972707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:22.596 [2024-07-15 13:41:09.972792] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2038190 00:18:22.596 [2024-07-15 13:41:09.972799] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:22.596 [2024-07-15 13:41:09.972915] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20cfe30 00:18:22.596 [2024-07-15 13:41:09.973016] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2038190 00:18:22.596 [2024-07-15 13:41:09.973024] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2038190 00:18:22.596 [2024-07-15 13:41:09.973094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:22.596 pt4 00:18:22.596 13:41:09 
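Each member was dropped earlier with a plain bdev_passthru_delete; bringing it back is just a matter of re-creating the passthru bdev on its malloc backing device. The raid module's examine path then finds the RAID superblock on it (raid superblock found on bdev pt4), re-claims it, and with three of the four members present the raid1 array is reconfigured online with no explicit raid RPC. Roughly, for one member, on the same socket and with the names from the trace (rpc.py path shortened):

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004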
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:22.596 13:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.596 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.596 "name": "raid_bdev1", 00:18:22.596 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:22.596 "strip_size_kb": 0, 00:18:22.596 "state": "online", 00:18:22.596 "raid_level": "raid1", 00:18:22.596 "superblock": true, 00:18:22.596 "num_base_bdevs": 4, 00:18:22.596 "num_base_bdevs_discovered": 3, 00:18:22.596 "num_base_bdevs_operational": 3, 00:18:22.596 "base_bdevs_list": [ 00:18:22.596 { 00:18:22.596 "name": null, 00:18:22.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.596 "is_configured": false, 00:18:22.596 "data_offset": 2048, 00:18:22.596 "data_size": 63488 00:18:22.596 }, 00:18:22.596 { 00:18:22.596 "name": "pt2", 00:18:22.596 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:22.596 "is_configured": true, 00:18:22.596 "data_offset": 2048, 00:18:22.596 "data_size": 63488 00:18:22.596 }, 00:18:22.596 { 00:18:22.596 "name": "pt3", 00:18:22.596 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:22.596 "is_configured": true, 00:18:22.596 "data_offset": 2048, 00:18:22.596 "data_size": 63488 00:18:22.596 }, 00:18:22.596 { 00:18:22.596 "name": "pt4", 00:18:22.596 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:22.596 "is_configured": true, 00:18:22.596 "data_offset": 2048, 00:18:22.596 "data_size": 63488 00:18:22.596 } 00:18:22.596 ] 00:18:22.596 }' 00:18:22.596 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.596 13:41:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.161 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:23.420 [2024-07-15 13:41:10.790404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:23.420 [2024-07-15 13:41:10.790430] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:23.420 [2024-07-15 13:41:10.790474] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:18:23.420 [2024-07-15 13:41:10.790524] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:23.420 [2024-07-15 13:41:10.790532] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2038190 name raid_bdev1, state offline 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:18:23.420 13:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:23.680 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:23.941 [2024-07-15 13:41:11.327784] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:23.941 [2024-07-15 13:41:11.327828] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:23.941 [2024-07-15 13:41:11.327843] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e9ae0 00:18:23.941 [2024-07-15 13:41:11.327851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:23.941 [2024-07-15 13:41:11.329074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:23.941 [2024-07-15 13:41:11.329099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:23.941 [2024-07-15 13:41:11.329153] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:23.941 [2024-07-15 13:41:11.329173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:23.941 [2024-07-15 13:41:11.329248] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:23.941 [2024-07-15 13:41:11.329257] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:23.941 [2024-07-15 13:41:11.329268] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d0210 name raid_bdev1, state configuring 00:18:23.941 [2024-07-15 13:41:11.329285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:23.941 [2024-07-15 13:41:11.329338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:23.941 pt1 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:23.941 "name": "raid_bdev1", 00:18:23.941 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:23.941 "strip_size_kb": 0, 00:18:23.941 "state": "configuring", 00:18:23.941 "raid_level": "raid1", 00:18:23.941 "superblock": true, 00:18:23.941 "num_base_bdevs": 4, 00:18:23.941 "num_base_bdevs_discovered": 2, 00:18:23.941 "num_base_bdevs_operational": 3, 00:18:23.941 "base_bdevs_list": [ 00:18:23.941 { 00:18:23.941 "name": null, 00:18:23.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.941 "is_configured": false, 00:18:23.941 "data_offset": 2048, 00:18:23.941 "data_size": 63488 00:18:23.941 }, 00:18:23.941 { 00:18:23.941 "name": "pt2", 00:18:23.941 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:23.941 "is_configured": true, 00:18:23.941 "data_offset": 2048, 00:18:23.941 "data_size": 63488 00:18:23.941 }, 00:18:23.941 { 00:18:23.941 "name": "pt3", 00:18:23.941 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:23.941 "is_configured": true, 00:18:23.941 "data_offset": 2048, 00:18:23.941 "data_size": 63488 00:18:23.941 }, 00:18:23.941 { 00:18:23.941 "name": null, 00:18:23.941 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:23.941 "is_configured": false, 00:18:23.941 "data_offset": 2048, 00:18:23.941 "data_size": 63488 00:18:23.941 } 00:18:23.941 ] 00:18:23.941 }' 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:23.941 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:24.510 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:24.510 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:24.768 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:24.769 [2024-07-15 13:41:12.358463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:24.769 [2024-07-15 13:41:12.358513] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:24.769 [2024-07-15 13:41:12.358529] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e84b0 00:18:24.769 [2024-07-15 13:41:12.358538] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:24.769 [2024-07-15 13:41:12.358816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:24.769 [2024-07-15 13:41:12.358830] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:24.769 [2024-07-15 13:41:12.358882] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:24.769 [2024-07-15 13:41:12.358898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:24.769 [2024-07-15 13:41:12.358986] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2039490 00:18:24.769 [2024-07-15 13:41:12.359004] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:24.769 [2024-07-15 13:41:12.359127] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20cfd50 00:18:24.769 [2024-07-15 13:41:12.359231] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2039490 00:18:24.769 [2024-07-15 13:41:12.359238] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2039490 00:18:24.769 [2024-07-15 13:41:12.359306] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:24.769 pt4 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.769 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:25.028 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.028 "name": "raid_bdev1", 00:18:25.028 "uuid": "09777c47-dfe1-4520-a042-c98d153d2bc9", 00:18:25.028 "strip_size_kb": 0, 00:18:25.028 "state": "online", 00:18:25.028 "raid_level": "raid1", 00:18:25.028 "superblock": true, 00:18:25.028 "num_base_bdevs": 4, 00:18:25.028 "num_base_bdevs_discovered": 3, 00:18:25.028 "num_base_bdevs_operational": 3, 00:18:25.028 "base_bdevs_list": [ 00:18:25.028 { 00:18:25.028 "name": null, 00:18:25.028 
"uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.028 "is_configured": false, 00:18:25.028 "data_offset": 2048, 00:18:25.028 "data_size": 63488 00:18:25.028 }, 00:18:25.028 { 00:18:25.028 "name": "pt2", 00:18:25.028 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:25.028 "is_configured": true, 00:18:25.028 "data_offset": 2048, 00:18:25.028 "data_size": 63488 00:18:25.028 }, 00:18:25.028 { 00:18:25.028 "name": "pt3", 00:18:25.028 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:25.028 "is_configured": true, 00:18:25.028 "data_offset": 2048, 00:18:25.028 "data_size": 63488 00:18:25.028 }, 00:18:25.028 { 00:18:25.028 "name": "pt4", 00:18:25.028 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:25.028 "is_configured": true, 00:18:25.028 "data_offset": 2048, 00:18:25.028 "data_size": 63488 00:18:25.028 } 00:18:25.028 ] 00:18:25.028 }' 00:18:25.028 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.028 13:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.597 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:25.597 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:25.597 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:18:25.597 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:25.597 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:18:25.855 [2024-07-15 13:41:13.341181] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 09777c47-dfe1-4520-a042-c98d153d2bc9 '!=' 09777c47-dfe1-4520-a042-c98d153d2bc9 ']' 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 53025 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 53025 ']' 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 53025 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 53025 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 53025' 00:18:25.855 killing process with pid 53025 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 53025 00:18:25.855 [2024-07-15 13:41:13.413601] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:25.855 [2024-07-15 13:41:13.413645] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:25.855 [2024-07-15 13:41:13.413697] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:25.855 [2024-07-15 13:41:13.413706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2039490 name raid_bdev1, state offline 00:18:25.855 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 53025 00:18:25.855 [2024-07-15 13:41:13.450001] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:26.114 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:26.114 00:18:26.114 real 0m19.738s 00:18:26.114 user 0m35.800s 00:18:26.114 sys 0m3.864s 00:18:26.114 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:26.114 13:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.114 ************************************ 00:18:26.114 END TEST raid_superblock_test 00:18:26.114 ************************************ 00:18:26.114 13:41:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:26.114 13:41:13 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:18:26.114 13:41:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:26.114 13:41:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:26.114 13:41:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:26.373 ************************************ 00:18:26.373 START TEST raid_read_error_test 00:18:26.373 ************************************ 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:26.373 
13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.4Ao31ZVSMP 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=56187 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 56187 /var/tmp/spdk-raid.sock 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 56187 ']' 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:26.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:26.373 13:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.373 [2024-07-15 13:41:13.819053] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:18:26.373 [2024-07-15 13:41:13.819114] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid56187 ] 00:18:26.373 [2024-07-15 13:41:13.906587] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:26.632 [2024-07-15 13:41:13.994629] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:26.632 [2024-07-15 13:41:14.049497] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:26.632 [2024-07-15 13:41:14.049528] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:27.200 13:41:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:27.200 13:41:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:27.200 13:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:27.200 13:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:27.200 BaseBdev1_malloc 00:18:27.200 13:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:27.460 true 00:18:27.460 13:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:27.773 [2024-07-15 13:41:15.146508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:27.773 [2024-07-15 13:41:15.146546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.773 [2024-07-15 13:41:15.146563] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1847990 00:18:27.773 [2024-07-15 13:41:15.146571] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.773 [2024-07-15 13:41:15.147736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.773 [2024-07-15 13:41:15.147760] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:27.773 BaseBdev1 00:18:27.773 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:27.773 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:27.774 BaseBdev2_malloc 00:18:27.774 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:28.047 true 00:18:28.047 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:28.306 [2024-07-15 13:41:15.671538] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:28.306 [2024-07-15 13:41:15.671572] vbdev_passthru.c: 
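Each RAID member in this read-error test is a small three-layer stack: a 32 MB malloc bdev with 512-byte blocks, an error bdev wrapped around it (exposed as EE_BaseBdevN_malloc), and a passthru bdev on top that the raid1 array claims. The error layer is what later lets the test inject read failures into one member while the array keeps serving I/O. A condensed sketch of one member plus the array, using the RPCs that appear below (socket, sizes and names as in the trace, rpc.py path shortened):

  # one member: malloc backing device, error wrapper, passthru exposed to RAID
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # ...repeat for BaseBdev2..BaseBdev4, then assemble the array with a superblock...
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
  # ...and arm a read error on the first member's error bdev
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure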
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.306 [2024-07-15 13:41:15.671586] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184c1d0 00:18:28.306 [2024-07-15 13:41:15.671594] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.306 [2024-07-15 13:41:15.672542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.306 [2024-07-15 13:41:15.672564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:28.306 BaseBdev2 00:18:28.306 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:28.306 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:28.306 BaseBdev3_malloc 00:18:28.306 13:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:28.564 true 00:18:28.564 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:28.824 [2024-07-15 13:41:16.200687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:28.824 [2024-07-15 13:41:16.200729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.824 [2024-07-15 13:41:16.200743] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184e490 00:18:28.824 [2024-07-15 13:41:16.200752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.824 [2024-07-15 13:41:16.201819] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.824 [2024-07-15 13:41:16.201844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:28.824 BaseBdev3 00:18:28.824 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:28.824 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:28.824 BaseBdev4_malloc 00:18:28.824 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:29.083 true 00:18:29.083 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:29.342 [2024-07-15 13:41:16.734000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:29.342 [2024-07-15 13:41:16.734057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:29.342 [2024-07-15 13:41:16.734072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184f360 00:18:29.342 [2024-07-15 13:41:16.734081] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:29.342 [2024-07-15 13:41:16.735077] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:18:29.342 [2024-07-15 13:41:16.735099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:29.342 BaseBdev4 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:29.342 [2024-07-15 13:41:16.906525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:29.342 [2024-07-15 13:41:16.907313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:29.342 [2024-07-15 13:41:16.907357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:29.342 [2024-07-15 13:41:16.907397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:29.342 [2024-07-15 13:41:16.907555] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18494e0 00:18:29.342 [2024-07-15 13:41:16.907562] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:29.342 [2024-07-15 13:41:16.907679] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169db20 00:18:29.342 [2024-07-15 13:41:16.907781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18494e0 00:18:29.342 [2024-07-15 13:41:16.907787] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18494e0 00:18:29.342 [2024-07-15 13:41:16.907851] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.342 13:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.600 13:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.600 "name": "raid_bdev1", 00:18:29.600 "uuid": "272c7dcd-fb60-428a-a00e-78c4c034c6c3", 00:18:29.600 "strip_size_kb": 0, 00:18:29.600 "state": "online", 00:18:29.600 "raid_level": "raid1", 00:18:29.600 "superblock": true, 00:18:29.600 "num_base_bdevs": 4, 00:18:29.600 "num_base_bdevs_discovered": 4, 00:18:29.600 
"num_base_bdevs_operational": 4, 00:18:29.600 "base_bdevs_list": [ 00:18:29.600 { 00:18:29.600 "name": "BaseBdev1", 00:18:29.600 "uuid": "57054dac-b880-54b8-837a-ec49c1704d55", 00:18:29.600 "is_configured": true, 00:18:29.600 "data_offset": 2048, 00:18:29.600 "data_size": 63488 00:18:29.600 }, 00:18:29.600 { 00:18:29.600 "name": "BaseBdev2", 00:18:29.600 "uuid": "7da011e0-23d1-517a-81ae-fa4374f554ef", 00:18:29.600 "is_configured": true, 00:18:29.600 "data_offset": 2048, 00:18:29.600 "data_size": 63488 00:18:29.600 }, 00:18:29.600 { 00:18:29.600 "name": "BaseBdev3", 00:18:29.600 "uuid": "87d4e48c-76b9-5c28-b1d5-6eb2297896a1", 00:18:29.600 "is_configured": true, 00:18:29.600 "data_offset": 2048, 00:18:29.600 "data_size": 63488 00:18:29.600 }, 00:18:29.600 { 00:18:29.600 "name": "BaseBdev4", 00:18:29.600 "uuid": "e7e44c78-5a5f-5ce8-af50-c7da594f57d5", 00:18:29.600 "is_configured": true, 00:18:29.600 "data_offset": 2048, 00:18:29.600 "data_size": 63488 00:18:29.600 } 00:18:29.600 ] 00:18:29.600 }' 00:18:29.600 13:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.600 13:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:30.168 13:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:30.168 13:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:30.168 [2024-07-15 13:41:17.664720] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169d520 00:18:31.103 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.362 "name": "raid_bdev1", 00:18:31.362 "uuid": "272c7dcd-fb60-428a-a00e-78c4c034c6c3", 00:18:31.362 "strip_size_kb": 0, 00:18:31.362 "state": "online", 00:18:31.362 "raid_level": "raid1", 00:18:31.362 "superblock": true, 00:18:31.362 "num_base_bdevs": 4, 00:18:31.362 "num_base_bdevs_discovered": 4, 00:18:31.362 "num_base_bdevs_operational": 4, 00:18:31.362 "base_bdevs_list": [ 00:18:31.362 { 00:18:31.362 "name": "BaseBdev1", 00:18:31.362 "uuid": "57054dac-b880-54b8-837a-ec49c1704d55", 00:18:31.362 "is_configured": true, 00:18:31.362 "data_offset": 2048, 00:18:31.362 "data_size": 63488 00:18:31.362 }, 00:18:31.362 { 00:18:31.362 "name": "BaseBdev2", 00:18:31.362 "uuid": "7da011e0-23d1-517a-81ae-fa4374f554ef", 00:18:31.362 "is_configured": true, 00:18:31.362 "data_offset": 2048, 00:18:31.362 "data_size": 63488 00:18:31.362 }, 00:18:31.362 { 00:18:31.362 "name": "BaseBdev3", 00:18:31.362 "uuid": "87d4e48c-76b9-5c28-b1d5-6eb2297896a1", 00:18:31.362 "is_configured": true, 00:18:31.362 "data_offset": 2048, 00:18:31.362 "data_size": 63488 00:18:31.362 }, 00:18:31.362 { 00:18:31.362 "name": "BaseBdev4", 00:18:31.362 "uuid": "e7e44c78-5a5f-5ce8-af50-c7da594f57d5", 00:18:31.362 "is_configured": true, 00:18:31.362 "data_offset": 2048, 00:18:31.362 "data_size": 63488 00:18:31.362 } 00:18:31.362 ] 00:18:31.362 }' 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.362 13:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.929 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:32.188 [2024-07-15 13:41:19.577506] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:32.188 [2024-07-15 13:41:19.577548] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:32.188 [2024-07-15 13:41:19.579571] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:32.188 [2024-07-15 13:41:19.579598] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.188 [2024-07-15 13:41:19.579674] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:32.188 [2024-07-15 13:41:19.579682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18494e0 name raid_bdev1, state offline 00:18:32.188 0 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 56187 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 56187 ']' 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 56187 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 56187 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 56187' 00:18:32.188 killing process with pid 56187 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 56187 00:18:32.188 [2024-07-15 13:41:19.633019] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:32.188 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 56187 00:18:32.188 [2024-07-15 13:41:19.661262] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.4Ao31ZVSMP 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:32.446 00:18:32.446 real 0m6.117s 00:18:32.446 user 0m9.423s 00:18:32.446 sys 0m1.128s 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:32.446 13:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.446 ************************************ 00:18:32.446 END TEST raid_read_error_test 00:18:32.446 ************************************ 00:18:32.446 13:41:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:32.446 13:41:19 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:18:32.446 13:41:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:32.446 13:41:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:32.446 13:41:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:32.446 ************************************ 00:18:32.446 START TEST raid_write_error_test 00:18:32.446 ************************************ 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.446 
13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.o1ciXKRUrZ 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=57011 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 57011 /var/tmp/spdk-raid.sock 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 57011 ']' 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:32.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:32.446 13:41:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.446 [2024-07-15 13:41:20.005121] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
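Both error tests drive I/O the same way: bdevperf is started idle (-z) against a dedicated RPC socket, the bdev stack is then assembled over that socket, and the workload is only triggered afterwards through bdevperf.py perform_tests. A minimal sketch of that pattern, assuming the SPDK tree used above and the same flags as this trace (the polling loop is only a stand-in for the harness's waitforlisten helper):

  SOCK=/var/tmp/spdk-raid.sock
  ./build/examples/bdevperf -r "$SOCK" -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
  # wait until the RPC server answers before configuring any bdevs
  until ./scripts/rpc.py -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
  # ... bdev_malloc_create / bdev_error_create / bdev_passthru_create / bdev_raid_create as traced below ...
  ./examples/bdev/bdevperf/bdevperf.py -s "$SOCK" perform_tests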
00:18:32.446 [2024-07-15 13:41:20.005173] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid57011 ] 00:18:32.704 [2024-07-15 13:41:20.097644] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:32.704 [2024-07-15 13:41:20.188364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:32.704 [2024-07-15 13:41:20.252633] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:32.704 [2024-07-15 13:41:20.252667] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:33.270 13:41:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:33.270 13:41:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:33.270 13:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.270 13:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:33.528 BaseBdev1_malloc 00:18:33.528 13:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:33.785 true 00:18:33.785 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:33.785 [2024-07-15 13:41:21.322309] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:33.785 [2024-07-15 13:41:21.322348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:33.785 [2024-07-15 13:41:21.322379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f4990 00:18:33.785 [2024-07-15 13:41:21.322387] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:33.785 [2024-07-15 13:41:21.323720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:33.785 [2024-07-15 13:41:21.323744] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:33.785 BaseBdev1 00:18:33.785 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.785 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:34.042 BaseBdev2_malloc 00:18:34.042 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:34.042 true 00:18:34.299 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:34.299 [2024-07-15 13:41:21.820054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:34.299 [2024-07-15 13:41:21.820088] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:34.299 [2024-07-15 13:41:21.820121] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f91d0 00:18:34.299 [2024-07-15 13:41:21.820130] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:34.299 [2024-07-15 13:41:21.821340] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:34.299 [2024-07-15 13:41:21.821361] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:34.299 BaseBdev2 00:18:34.299 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:34.299 13:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:34.556 BaseBdev3_malloc 00:18:34.556 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:34.812 true 00:18:34.812 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:34.812 [2024-07-15 13:41:22.342312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:34.812 [2024-07-15 13:41:22.342345] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:34.812 [2024-07-15 13:41:22.342381] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22fb490 00:18:34.812 [2024-07-15 13:41:22.342390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:34.812 [2024-07-15 13:41:22.343553] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:34.812 [2024-07-15 13:41:22.343577] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:34.812 BaseBdev3 00:18:34.812 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:34.812 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:35.069 BaseBdev4_malloc 00:18:35.069 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:35.326 true 00:18:35.326 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:35.327 [2024-07-15 13:41:22.855987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:35.327 [2024-07-15 13:41:22.856043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.327 [2024-07-15 13:41:22.856059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22fc360 00:18:35.327 [2024-07-15 13:41:22.856068] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.327 [2024-07-15 13:41:22.857205] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.327 [2024-07-15 13:41:22.857229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:35.327 BaseBdev4 00:18:35.327 13:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:35.584 [2024-07-15 13:41:23.028459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:35.584 [2024-07-15 13:41:23.029408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:35.584 [2024-07-15 13:41:23.029456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:35.584 [2024-07-15 13:41:23.029495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:35.584 [2024-07-15 13:41:23.029653] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22f64e0 00:18:35.584 [2024-07-15 13:41:23.029661] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:35.584 [2024-07-15 13:41:23.029799] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x214ab20 00:18:35.584 [2024-07-15 13:41:23.029905] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22f64e0 00:18:35.584 [2024-07-15 13:41:23.029912] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22f64e0 00:18:35.584 [2024-07-15 13:41:23.029984] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:35.584 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.841 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.841 "name": "raid_bdev1", 00:18:35.841 "uuid": "f49fda4f-605b-409f-8457-16fbd8d5fa80", 00:18:35.841 "strip_size_kb": 0, 00:18:35.841 "state": "online", 00:18:35.841 "raid_level": "raid1", 00:18:35.841 "superblock": true, 00:18:35.841 "num_base_bdevs": 4, 00:18:35.841 
"num_base_bdevs_discovered": 4, 00:18:35.841 "num_base_bdevs_operational": 4, 00:18:35.841 "base_bdevs_list": [ 00:18:35.841 { 00:18:35.841 "name": "BaseBdev1", 00:18:35.841 "uuid": "3a88b27c-8358-5494-8f69-2fd5d9a2eb00", 00:18:35.841 "is_configured": true, 00:18:35.841 "data_offset": 2048, 00:18:35.841 "data_size": 63488 00:18:35.841 }, 00:18:35.841 { 00:18:35.841 "name": "BaseBdev2", 00:18:35.841 "uuid": "076c3e30-1b05-5706-90e5-018fdfe1e93d", 00:18:35.841 "is_configured": true, 00:18:35.841 "data_offset": 2048, 00:18:35.841 "data_size": 63488 00:18:35.841 }, 00:18:35.841 { 00:18:35.841 "name": "BaseBdev3", 00:18:35.841 "uuid": "2a44f10d-0677-566c-b5ef-e3e134049d73", 00:18:35.841 "is_configured": true, 00:18:35.841 "data_offset": 2048, 00:18:35.841 "data_size": 63488 00:18:35.841 }, 00:18:35.841 { 00:18:35.841 "name": "BaseBdev4", 00:18:35.841 "uuid": "f01b12b4-52b3-5627-a7f8-f9a40afa7159", 00:18:35.841 "is_configured": true, 00:18:35.841 "data_offset": 2048, 00:18:35.841 "data_size": 63488 00:18:35.841 } 00:18:35.841 ] 00:18:35.841 }' 00:18:35.841 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.841 13:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:36.098 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:36.098 13:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:36.356 [2024-07-15 13:41:23.766577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x214a520 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:37.290 [2024-07-15 13:41:24.850188] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:37.290 [2024-07-15 13:41:24.850239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:37.290 [2024-07-15 13:41:24.850437] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x214a520 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.290 13:41:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.548 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.548 "name": "raid_bdev1", 00:18:37.548 "uuid": "f49fda4f-605b-409f-8457-16fbd8d5fa80", 00:18:37.548 "strip_size_kb": 0, 00:18:37.548 "state": "online", 00:18:37.548 "raid_level": "raid1", 00:18:37.548 "superblock": true, 00:18:37.548 "num_base_bdevs": 4, 00:18:37.548 "num_base_bdevs_discovered": 3, 00:18:37.548 "num_base_bdevs_operational": 3, 00:18:37.548 "base_bdevs_list": [ 00:18:37.548 { 00:18:37.548 "name": null, 00:18:37.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:37.548 "is_configured": false, 00:18:37.548 "data_offset": 2048, 00:18:37.548 "data_size": 63488 00:18:37.548 }, 00:18:37.548 { 00:18:37.548 "name": "BaseBdev2", 00:18:37.548 "uuid": "076c3e30-1b05-5706-90e5-018fdfe1e93d", 00:18:37.548 "is_configured": true, 00:18:37.548 "data_offset": 2048, 00:18:37.548 "data_size": 63488 00:18:37.548 }, 00:18:37.548 { 00:18:37.548 "name": "BaseBdev3", 00:18:37.548 "uuid": "2a44f10d-0677-566c-b5ef-e3e134049d73", 00:18:37.548 "is_configured": true, 00:18:37.548 "data_offset": 2048, 00:18:37.548 "data_size": 63488 00:18:37.548 }, 00:18:37.548 { 00:18:37.548 "name": "BaseBdev4", 00:18:37.548 "uuid": "f01b12b4-52b3-5627-a7f8-f9a40afa7159", 00:18:37.548 "is_configured": true, 00:18:37.548 "data_offset": 2048, 00:18:37.548 "data_size": 63488 00:18:37.548 } 00:18:37.548 ] 00:18:37.548 }' 00:18:37.548 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.548 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.115 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:38.115 [2024-07-15 13:41:25.719493] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:38.115 [2024-07-15 13:41:25.719531] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:38.115 [2024-07-15 13:41:25.721585] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:38.115 [2024-07-15 13:41:25.721610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:38.115 [2024-07-15 13:41:25.721674] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:38.115 [2024-07-15 13:41:25.721681] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f64e0 name raid_bdev1, state offline 00:18:38.115 0 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 57011 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 57011 ']' 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@952 -- # kill -0 57011 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 57011 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 57011' 00:18:38.375 killing process with pid 57011 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 57011 00:18:38.375 [2024-07-15 13:41:25.782673] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:38.375 13:41:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 57011 00:18:38.375 [2024-07-15 13:41:25.811732] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:38.634 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.o1ciXKRUrZ 00:18:38.634 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:38.634 13:41:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:38.634 00:18:38.634 real 0m6.075s 00:18:38.634 user 0m9.380s 00:18:38.634 sys 0m1.072s 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:38.634 13:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.634 ************************************ 00:18:38.634 END TEST raid_write_error_test 00:18:38.634 ************************************ 00:18:38.634 13:41:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:38.634 13:41:26 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:18:38.634 13:41:26 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:18:38.634 13:41:26 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:18:38.634 13:41:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:38.634 13:41:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:38.634 13:41:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:38.634 ************************************ 00:18:38.634 START TEST raid_rebuild_test 00:18:38.634 ************************************ 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local 
num_base_bdevs=2 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:18:38.634 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=57978 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 57978 /var/tmp/spdk-raid.sock 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 57978 ']' 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:38.635 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:38.635 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.635 [2024-07-15 13:41:26.140546] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
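raid_rebuild_test raid1 2 false false true builds a two-member raid1 without a superblock (no -s on the create call) plus a delay-wrapped "spare" bdev that is swapped in later to exercise rebuild; bdevperf here uses 3 MiB I/Os (-o 3M -q 2 -U), which is why the zero-copy-threshold notice appears below. A condensed sketch of the topology the traced RPC calls construct, assuming the same socket and SPDK tree:

  SOCK=/var/tmp/spdk-raid.sock
  for b in BaseBdev1 BaseBdev2; do
    ./scripts/rpc.py -s "$SOCK" bdev_malloc_create 32 512 -b ${b}_malloc
    ./scripts/rpc.py -s "$SOCK" bdev_passthru_create -b ${b}_malloc -p $b
  done
  # the spare is stacked on a delay bdev (100 ms writes), presumably to keep the rebuild observable
  ./scripts/rpc.py -s "$SOCK" bdev_malloc_create 32 512 -b spare_malloc
  ./scripts/rpc.py -s "$SOCK" bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  ./scripts/rpc.py -s "$SOCK" bdev_passthru_create -b spare_delay -p spare
  ./scripts/rpc.py -s "$SOCK" bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1   # superblock=false, so no -s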
00:18:38.635 [2024-07-15 13:41:26.140589] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid57978 ] 00:18:38.635 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:38.635 Zero copy mechanism will not be used. 00:18:38.635 [2024-07-15 13:41:26.225361] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:38.894 [2024-07-15 13:41:26.313860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:38.894 [2024-07-15 13:41:26.370166] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:38.894 [2024-07-15 13:41:26.370199] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:39.462 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:39.462 13:41:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:18:39.462 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:39.462 13:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:39.721 BaseBdev1_malloc 00:18:39.721 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:39.721 [2024-07-15 13:41:27.277404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:39.721 [2024-07-15 13:41:27.277442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:39.721 [2024-07-15 13:41:27.277474] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1beb600 00:18:39.721 [2024-07-15 13:41:27.277484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:39.721 [2024-07-15 13:41:27.278822] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:39.721 [2024-07-15 13:41:27.278845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:39.721 BaseBdev1 00:18:39.721 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:39.721 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:39.979 BaseBdev2_malloc 00:18:39.979 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:40.238 [2024-07-15 13:41:27.622957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:40.238 [2024-07-15 13:41:27.622991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.238 [2024-07-15 13:41:27.623030] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bec120 00:18:40.238 [2024-07-15 13:41:27.623039] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.238 [2024-07-15 13:41:27.624141] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.238 [2024-07-15 13:41:27.624162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:40.238 BaseBdev2 00:18:40.238 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:40.238 spare_malloc 00:18:40.238 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:40.497 spare_delay 00:18:40.497 13:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:40.756 [2024-07-15 13:41:28.127957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:40.756 [2024-07-15 13:41:28.127993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.756 [2024-07-15 13:41:28.128030] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d9a780 00:18:40.756 [2024-07-15 13:41:28.128039] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.756 [2024-07-15 13:41:28.129201] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.756 [2024-07-15 13:41:28.129224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:40.756 spare 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:40.756 [2024-07-15 13:41:28.300416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:40.756 [2024-07-15 13:41:28.301410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:40.756 [2024-07-15 13:41:28.301476] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d9b930 00:18:40.756 [2024-07-15 13:41:28.301485] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:40.756 [2024-07-15 13:41:28.301639] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d94d50 00:18:40.756 [2024-07-15 13:41:28.301741] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d9b930 00:18:40.756 [2024-07-15 13:41:28.301748] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d9b930 00:18:40.756 [2024-07-15 13:41:28.301827] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=2 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.756 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:41.015 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.015 "name": "raid_bdev1", 00:18:41.015 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:41.015 "strip_size_kb": 0, 00:18:41.015 "state": "online", 00:18:41.015 "raid_level": "raid1", 00:18:41.015 "superblock": false, 00:18:41.015 "num_base_bdevs": 2, 00:18:41.015 "num_base_bdevs_discovered": 2, 00:18:41.015 "num_base_bdevs_operational": 2, 00:18:41.015 "base_bdevs_list": [ 00:18:41.015 { 00:18:41.015 "name": "BaseBdev1", 00:18:41.015 "uuid": "593e4d9f-411b-5aaa-8347-309287ff303d", 00:18:41.015 "is_configured": true, 00:18:41.015 "data_offset": 0, 00:18:41.015 "data_size": 65536 00:18:41.015 }, 00:18:41.015 { 00:18:41.015 "name": "BaseBdev2", 00:18:41.015 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:41.015 "is_configured": true, 00:18:41.015 "data_offset": 0, 00:18:41.015 "data_size": 65536 00:18:41.015 } 00:18:41.015 ] 00:18:41.015 }' 00:18:41.015 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.015 13:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.580 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:41.580 13:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:41.580 [2024-07-15 13:41:29.126798] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:41.580 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:18:41.580 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.580 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:41.838 13:41:29 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:41.838 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:42.095 [2024-07-15 13:41:29.479567] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d94d50 00:18:42.095 /dev/nbd0 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:42.095 1+0 records in 00:18:42.095 1+0 records out 00:18:42.095 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228857 s, 17.9 MB/s 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:18:42.095 13:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:18:46.271 65536+0 records in 00:18:46.271 65536+0 records out 00:18:46.271 33554432 bytes (34 MB, 32 MiB) copied, 4.06932 s, 8.2 MB/s 00:18:46.271 
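The 32 MiB written through /dev/nbd0 above is exactly the array capacity the trace reports for raid_bdev1 (blockcnt 65536, blocklen 512, data_offset 0): 65536 × 512 B = 33,554,432 B. A sketch of deriving the dd size from the bdev instead of hard-coding it, with the same tools the test uses (run from the SPDK tree):

  SOCK=/var/tmp/spdk-raid.sock
  blocks=$(./scripts/rpc.py -s "$SOCK" bdev_get_bdevs -b raid_bdev1 | jq -r '.[].num_blocks')
  # 65536 blocks * 512 B blocklen = 33554432 bytes (32 MiB), matching the dd output above
  dd if=/dev/urandom of=/dev/nbd0 bs=512 count="$blocks" oflag=direct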
13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:46.271 [2024-07-15 13:41:33.796132] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:46.271 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:46.529 [2024-07-15 13:41:33.968614] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.529 13:41:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:46.786 13:41:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.786 "name": "raid_bdev1", 
00:18:46.786 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:46.786 "strip_size_kb": 0, 00:18:46.786 "state": "online", 00:18:46.786 "raid_level": "raid1", 00:18:46.786 "superblock": false, 00:18:46.786 "num_base_bdevs": 2, 00:18:46.786 "num_base_bdevs_discovered": 1, 00:18:46.786 "num_base_bdevs_operational": 1, 00:18:46.786 "base_bdevs_list": [ 00:18:46.786 { 00:18:46.786 "name": null, 00:18:46.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.786 "is_configured": false, 00:18:46.786 "data_offset": 0, 00:18:46.786 "data_size": 65536 00:18:46.786 }, 00:18:46.786 { 00:18:46.786 "name": "BaseBdev2", 00:18:46.786 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:46.786 "is_configured": true, 00:18:46.786 "data_offset": 0, 00:18:46.786 "data_size": 65536 00:18:46.786 } 00:18:46.786 ] 00:18:46.786 }' 00:18:46.786 13:41:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.786 13:41:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.043 13:41:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:47.300 [2024-07-15 13:41:34.810777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:47.300 [2024-07-15 13:41:34.815213] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d9c140 00:18:47.300 [2024-07-15 13:41:34.816784] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:47.300 13:41:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.232 13:41:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:48.489 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:48.489 "name": "raid_bdev1", 00:18:48.489 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:48.489 "strip_size_kb": 0, 00:18:48.489 "state": "online", 00:18:48.489 "raid_level": "raid1", 00:18:48.489 "superblock": false, 00:18:48.489 "num_base_bdevs": 2, 00:18:48.489 "num_base_bdevs_discovered": 2, 00:18:48.489 "num_base_bdevs_operational": 2, 00:18:48.489 "process": { 00:18:48.489 "type": "rebuild", 00:18:48.489 "target": "spare", 00:18:48.489 "progress": { 00:18:48.489 "blocks": 22528, 00:18:48.489 "percent": 34 00:18:48.489 } 00:18:48.489 }, 00:18:48.489 "base_bdevs_list": [ 00:18:48.489 { 00:18:48.489 "name": "spare", 00:18:48.489 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:48.489 "is_configured": true, 00:18:48.489 "data_offset": 0, 00:18:48.489 "data_size": 65536 00:18:48.489 }, 00:18:48.489 { 00:18:48.489 "name": "BaseBdev2", 00:18:48.489 "uuid": 
"8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:48.489 "is_configured": true, 00:18:48.489 "data_offset": 0, 00:18:48.489 "data_size": 65536 00:18:48.489 } 00:18:48.489 ] 00:18:48.489 }' 00:18:48.489 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:48.489 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:48.489 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:48.489 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:48.489 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:48.747 [2024-07-15 13:41:36.251741] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:48.747 [2024-07-15 13:41:36.327884] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:48.747 [2024-07-15 13:41:36.327939] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:48.747 [2024-07-15 13:41:36.327950] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:48.747 [2024-07-15 13:41:36.327956] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.747 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.004 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.004 "name": "raid_bdev1", 00:18:49.004 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:49.004 "strip_size_kb": 0, 00:18:49.004 "state": "online", 00:18:49.004 "raid_level": "raid1", 00:18:49.004 "superblock": false, 00:18:49.004 "num_base_bdevs": 2, 00:18:49.004 "num_base_bdevs_discovered": 1, 00:18:49.004 "num_base_bdevs_operational": 1, 00:18:49.004 "base_bdevs_list": [ 00:18:49.004 { 00:18:49.004 "name": null, 00:18:49.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.004 "is_configured": false, 00:18:49.004 "data_offset": 0, 00:18:49.004 "data_size": 65536 00:18:49.004 
}, 00:18:49.004 { 00:18:49.004 "name": "BaseBdev2", 00:18:49.004 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:49.004 "is_configured": true, 00:18:49.004 "data_offset": 0, 00:18:49.004 "data_size": 65536 00:18:49.004 } 00:18:49.004 ] 00:18:49.004 }' 00:18:49.004 13:41:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.004 13:41:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.569 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.828 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:49.828 "name": "raid_bdev1", 00:18:49.828 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:49.828 "strip_size_kb": 0, 00:18:49.828 "state": "online", 00:18:49.828 "raid_level": "raid1", 00:18:49.828 "superblock": false, 00:18:49.828 "num_base_bdevs": 2, 00:18:49.828 "num_base_bdevs_discovered": 1, 00:18:49.828 "num_base_bdevs_operational": 1, 00:18:49.828 "base_bdevs_list": [ 00:18:49.828 { 00:18:49.828 "name": null, 00:18:49.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.828 "is_configured": false, 00:18:49.828 "data_offset": 0, 00:18:49.828 "data_size": 65536 00:18:49.828 }, 00:18:49.828 { 00:18:49.828 "name": "BaseBdev2", 00:18:49.828 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:49.828 "is_configured": true, 00:18:49.828 "data_offset": 0, 00:18:49.828 "data_size": 65536 00:18:49.828 } 00:18:49.828 ] 00:18:49.828 }' 00:18:49.828 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:49.828 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:49.828 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:49.828 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:49.828 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:49.828 [2024-07-15 13:41:37.431297] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:49.828 [2024-07-15 13:41:37.435700] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d94d50 00:18:49.828 [2024-07-15 13:41:37.436724] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:50.087 13:41:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:51.023 13:41:38 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.023 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:51.023 "name": "raid_bdev1", 00:18:51.023 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:51.023 "strip_size_kb": 0, 00:18:51.023 "state": "online", 00:18:51.023 "raid_level": "raid1", 00:18:51.023 "superblock": false, 00:18:51.023 "num_base_bdevs": 2, 00:18:51.023 "num_base_bdevs_discovered": 2, 00:18:51.023 "num_base_bdevs_operational": 2, 00:18:51.023 "process": { 00:18:51.023 "type": "rebuild", 00:18:51.023 "target": "spare", 00:18:51.023 "progress": { 00:18:51.023 "blocks": 22528, 00:18:51.023 "percent": 34 00:18:51.023 } 00:18:51.023 }, 00:18:51.023 "base_bdevs_list": [ 00:18:51.023 { 00:18:51.023 "name": "spare", 00:18:51.023 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:51.023 "is_configured": true, 00:18:51.023 "data_offset": 0, 00:18:51.023 "data_size": 65536 00:18:51.023 }, 00:18:51.023 { 00:18:51.023 "name": "BaseBdev2", 00:18:51.023 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:51.023 "is_configured": true, 00:18:51.023 "data_offset": 0, 00:18:51.023 "data_size": 65536 00:18:51.023 } 00:18:51.023 ] 00:18:51.023 }' 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=598 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.284 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.627 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:51.627 "name": "raid_bdev1", 00:18:51.627 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:51.627 "strip_size_kb": 0, 00:18:51.627 "state": "online", 00:18:51.628 "raid_level": "raid1", 00:18:51.628 "superblock": false, 00:18:51.628 "num_base_bdevs": 2, 00:18:51.628 "num_base_bdevs_discovered": 2, 00:18:51.628 "num_base_bdevs_operational": 2, 00:18:51.628 "process": { 00:18:51.628 "type": "rebuild", 00:18:51.628 "target": "spare", 00:18:51.628 "progress": { 00:18:51.628 "blocks": 28672, 00:18:51.628 "percent": 43 00:18:51.628 } 00:18:51.628 }, 00:18:51.628 "base_bdevs_list": [ 00:18:51.628 { 00:18:51.628 "name": "spare", 00:18:51.628 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:51.628 "is_configured": true, 00:18:51.628 "data_offset": 0, 00:18:51.628 "data_size": 65536 00:18:51.628 }, 00:18:51.628 { 00:18:51.628 "name": "BaseBdev2", 00:18:51.628 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:51.628 "is_configured": true, 00:18:51.628 "data_offset": 0, 00:18:51.628 "data_size": 65536 00:18:51.628 } 00:18:51.628 ] 00:18:51.628 }' 00:18:51.628 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:51.628 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:51.628 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:51.628 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:51.628 13:41:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:52.561 13:41:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:52.561 13:41:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:52.561 13:41:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:52.561 13:41:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:52.561 13:41:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:52.561 13:41:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:52.561 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:52.561 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.561 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:52.561 "name": "raid_bdev1", 00:18:52.561 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:52.561 "strip_size_kb": 0, 00:18:52.561 "state": "online", 00:18:52.561 "raid_level": "raid1", 00:18:52.561 "superblock": false, 00:18:52.561 "num_base_bdevs": 2, 00:18:52.561 "num_base_bdevs_discovered": 2, 00:18:52.561 "num_base_bdevs_operational": 2, 00:18:52.561 "process": { 00:18:52.561 "type": "rebuild", 00:18:52.561 "target": "spare", 00:18:52.561 "progress": { 00:18:52.561 "blocks": 53248, 00:18:52.561 
"percent": 81 00:18:52.561 } 00:18:52.561 }, 00:18:52.561 "base_bdevs_list": [ 00:18:52.561 { 00:18:52.561 "name": "spare", 00:18:52.561 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:52.561 "is_configured": true, 00:18:52.561 "data_offset": 0, 00:18:52.561 "data_size": 65536 00:18:52.561 }, 00:18:52.561 { 00:18:52.561 "name": "BaseBdev2", 00:18:52.561 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:52.561 "is_configured": true, 00:18:52.561 "data_offset": 0, 00:18:52.561 "data_size": 65536 00:18:52.561 } 00:18:52.561 ] 00:18:52.561 }' 00:18:52.561 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:52.819 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:52.819 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:52.819 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:52.819 13:41:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:53.077 [2024-07-15 13:41:40.660030] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:53.077 [2024-07-15 13:41:40.660078] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:53.077 [2024-07-15 13:41:40.660108] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.643 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:53.905 "name": "raid_bdev1", 00:18:53.905 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:53.905 "strip_size_kb": 0, 00:18:53.905 "state": "online", 00:18:53.905 "raid_level": "raid1", 00:18:53.905 "superblock": false, 00:18:53.905 "num_base_bdevs": 2, 00:18:53.905 "num_base_bdevs_discovered": 2, 00:18:53.905 "num_base_bdevs_operational": 2, 00:18:53.905 "base_bdevs_list": [ 00:18:53.905 { 00:18:53.905 "name": "spare", 00:18:53.905 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:53.905 "is_configured": true, 00:18:53.905 "data_offset": 0, 00:18:53.905 "data_size": 65536 00:18:53.905 }, 00:18:53.905 { 00:18:53.905 "name": "BaseBdev2", 00:18:53.905 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:53.905 "is_configured": true, 00:18:53.905 "data_offset": 0, 00:18:53.905 "data_size": 65536 00:18:53.905 } 00:18:53.905 ] 00:18:53.905 }' 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:53.905 13:41:41 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.905 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:54.162 "name": "raid_bdev1", 00:18:54.162 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:54.162 "strip_size_kb": 0, 00:18:54.162 "state": "online", 00:18:54.162 "raid_level": "raid1", 00:18:54.162 "superblock": false, 00:18:54.162 "num_base_bdevs": 2, 00:18:54.162 "num_base_bdevs_discovered": 2, 00:18:54.162 "num_base_bdevs_operational": 2, 00:18:54.162 "base_bdevs_list": [ 00:18:54.162 { 00:18:54.162 "name": "spare", 00:18:54.162 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:54.162 "is_configured": true, 00:18:54.162 "data_offset": 0, 00:18:54.162 "data_size": 65536 00:18:54.162 }, 00:18:54.162 { 00:18:54.162 "name": "BaseBdev2", 00:18:54.162 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:54.162 "is_configured": true, 00:18:54.162 "data_offset": 0, 00:18:54.162 "data_size": 65536 00:18:54.162 } 00:18:54.162 ] 00:18:54.162 }' 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.162 13:41:41 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.162 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:54.419 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.419 "name": "raid_bdev1", 00:18:54.419 "uuid": "7a3e5d54-bfa8-49a1-bf36-3dbab71b822c", 00:18:54.419 "strip_size_kb": 0, 00:18:54.419 "state": "online", 00:18:54.419 "raid_level": "raid1", 00:18:54.419 "superblock": false, 00:18:54.419 "num_base_bdevs": 2, 00:18:54.419 "num_base_bdevs_discovered": 2, 00:18:54.419 "num_base_bdevs_operational": 2, 00:18:54.419 "base_bdevs_list": [ 00:18:54.419 { 00:18:54.419 "name": "spare", 00:18:54.419 "uuid": "fd48ba65-344d-5b1e-b9f7-119db4ce8573", 00:18:54.419 "is_configured": true, 00:18:54.419 "data_offset": 0, 00:18:54.419 "data_size": 65536 00:18:54.419 }, 00:18:54.419 { 00:18:54.419 "name": "BaseBdev2", 00:18:54.419 "uuid": "8bf9bfa3-6531-55eb-98a1-fba75c632a62", 00:18:54.419 "is_configured": true, 00:18:54.419 "data_offset": 0, 00:18:54.419 "data_size": 65536 00:18:54.419 } 00:18:54.419 ] 00:18:54.419 }' 00:18:54.419 13:41:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.419 13:41:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.985 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:54.985 [2024-07-15 13:41:42.581515] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:54.985 [2024-07-15 13:41:42.581541] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:54.985 [2024-07-15 13:41:42.581588] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:54.985 [2024-07-15 13:41:42.581628] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:54.985 [2024-07-15 13:41:42.581636] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d9b930 name raid_bdev1, state offline 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@10 -- # local bdev_list 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:55.243 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:55.502 /dev/nbd0 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:55.502 1+0 records in 00:18:55.502 1+0 records out 00:18:55.502 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002618 s, 15.6 MB/s 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:55.502 13:41:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:55.761 /dev/nbd1 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:55.761 
13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:55.761 1+0 records in 00:18:55.761 1+0 records out 00:18:55.761 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268562 s, 15.3 MB/s 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:55.761 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:56.020 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 57978 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 57978 ']' 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 57978 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 57978 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 57978' 00:18:56.279 killing process with pid 57978 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 57978 00:18:56.279 Received shutdown signal, test time was about 60.000000 seconds 00:18:56.279 00:18:56.279 Latency(us) 00:18:56.279 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:56.279 =================================================================================================================== 00:18:56.279 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:56.279 [2024-07-15 13:41:43.706270] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:56.279 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 57978 00:18:56.279 [2024-07-15 13:41:43.732707] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:56.538 13:41:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:18:56.538 00:18:56.538 real 0m17.844s 00:18:56.538 user 0m23.448s 00:18:56.538 sys 0m3.946s 00:18:56.538 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:56.538 13:41:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.538 ************************************ 00:18:56.538 END TEST raid_rebuild_test 00:18:56.538 ************************************ 00:18:56.538 13:41:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 
0 00:18:56.538 13:41:43 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:18:56.538 13:41:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:56.538 13:41:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:56.538 13:41:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:56.538 ************************************ 00:18:56.538 START TEST raid_rebuild_test_sb 00:18:56.538 ************************************ 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=60520 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 60520 /var/tmp/spdk-raid.sock 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T 
raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 60520 ']' 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:56.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:56.538 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:56.538 [2024-07-15 13:41:44.088567] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:18:56.538 [2024-07-15 13:41:44.088620] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid60520 ] 00:18:56.538 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:56.538 Zero copy mechanism will not be used. 00:18:56.797 [2024-07-15 13:41:44.175267] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:56.797 [2024-07-15 13:41:44.264153] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:56.797 [2024-07-15 13:41:44.316527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:56.797 [2024-07-15 13:41:44.316551] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.364 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:57.364 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:57.364 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:57.364 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:57.623 BaseBdev1_malloc 00:18:57.623 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:57.623 [2024-07-15 13:41:45.228670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:57.623 [2024-07-15 13:41:45.228713] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:57.623 [2024-07-15 13:41:45.228728] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110a600 00:18:57.623 [2024-07-15 13:41:45.228736] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:57.623 [2024-07-15 13:41:45.229902] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:57.623 [2024-07-15 13:41:45.229924] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:57.623 BaseBdev1 00:18:57.881 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in 
"${base_bdevs[@]}" 00:18:57.881 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:57.881 BaseBdev2_malloc 00:18:57.881 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:58.140 [2024-07-15 13:41:45.577457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:58.140 [2024-07-15 13:41:45.577494] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.140 [2024-07-15 13:41:45.577512] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110b120 00:18:58.140 [2024-07-15 13:41:45.577520] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.140 [2024-07-15 13:41:45.578515] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.140 [2024-07-15 13:41:45.578538] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:58.140 BaseBdev2 00:18:58.140 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:58.399 spare_malloc 00:18:58.399 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:58.399 spare_delay 00:18:58.399 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:58.657 [2024-07-15 13:41:46.115040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:58.657 [2024-07-15 13:41:46.115087] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.657 [2024-07-15 13:41:46.115102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b9780 00:18:58.657 [2024-07-15 13:41:46.115110] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.657 [2024-07-15 13:41:46.116132] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.657 [2024-07-15 13:41:46.116155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:58.657 spare 00:18:58.657 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:58.923 [2024-07-15 13:41:46.295522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:58.923 [2024-07-15 13:41:46.296309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:58.923 [2024-07-15 13:41:46.296420] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ba930 00:18:58.923 [2024-07-15 13:41:46.296428] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:58.923 [2024-07-15 13:41:46.296550] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x12b3d50 00:18:58.923 [2024-07-15 13:41:46.296642] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ba930 00:18:58.923 [2024-07-15 13:41:46.296648] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ba930 00:18:58.923 [2024-07-15 13:41:46.296707] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:58.923 "name": "raid_bdev1", 00:18:58.923 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:18:58.923 "strip_size_kb": 0, 00:18:58.923 "state": "online", 00:18:58.923 "raid_level": "raid1", 00:18:58.923 "superblock": true, 00:18:58.923 "num_base_bdevs": 2, 00:18:58.923 "num_base_bdevs_discovered": 2, 00:18:58.923 "num_base_bdevs_operational": 2, 00:18:58.923 "base_bdevs_list": [ 00:18:58.923 { 00:18:58.923 "name": "BaseBdev1", 00:18:58.923 "uuid": "384aed1f-422b-54ca-964a-f1a10fa42879", 00:18:58.923 "is_configured": true, 00:18:58.923 "data_offset": 2048, 00:18:58.923 "data_size": 63488 00:18:58.923 }, 00:18:58.923 { 00:18:58.923 "name": "BaseBdev2", 00:18:58.923 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:18:58.923 "is_configured": true, 00:18:58.923 "data_offset": 2048, 00:18:58.923 "data_size": 63488 00:18:58.923 } 00:18:58.923 ] 00:18:58.923 }' 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:58.923 13:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:59.490 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:59.490 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:59.749 [2024-07-15 13:41:47.145881] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # 
raid_bdev_size=63488 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:59.749 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:00.008 [2024-07-15 13:41:47.498665] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b3d50 00:19:00.008 /dev/nbd0 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:00.008 1+0 records in 00:19:00.008 1+0 records out 00:19:00.008 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239159 s, 17.1 MB/s 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:00.008 13:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:19:04.196 63488+0 records in 00:19:04.196 63488+0 records out 00:19:04.196 32505856 bytes (33 MB, 31 MiB) copied, 3.74992 s, 8.7 MB/s 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:04.196 [2024-07-15 13:41:51.503149] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:04.196 [2024-07-15 13:41:51.683639] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.196 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.197 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.197 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:04.197 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.455 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.455 "name": "raid_bdev1", 00:19:04.455 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:04.455 "strip_size_kb": 0, 00:19:04.455 "state": "online", 00:19:04.455 "raid_level": "raid1", 00:19:04.455 "superblock": true, 00:19:04.455 "num_base_bdevs": 2, 00:19:04.455 "num_base_bdevs_discovered": 1, 00:19:04.455 "num_base_bdevs_operational": 1, 00:19:04.455 "base_bdevs_list": [ 00:19:04.455 { 00:19:04.455 "name": null, 00:19:04.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.455 "is_configured": false, 00:19:04.455 "data_offset": 2048, 00:19:04.455 "data_size": 63488 00:19:04.455 }, 00:19:04.455 { 00:19:04.455 "name": "BaseBdev2", 00:19:04.455 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:04.455 "is_configured": true, 00:19:04.455 "data_offset": 2048, 00:19:04.455 "data_size": 63488 00:19:04.455 } 00:19:04.455 ] 00:19:04.455 }' 00:19:04.455 13:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.455 13:41:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:05.021 13:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:05.021 [2024-07-15 13:41:52.537849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:05.021 [2024-07-15 13:41:52.542373] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ba5a0 00:19:05.021 [2024-07-15 13:41:52.543964] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:05.021 13:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.956 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.214 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:06.214 "name": "raid_bdev1", 00:19:06.214 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:06.214 "strip_size_kb": 0, 00:19:06.214 "state": "online", 00:19:06.214 "raid_level": "raid1", 00:19:06.214 "superblock": true, 00:19:06.214 "num_base_bdevs": 2, 00:19:06.214 "num_base_bdevs_discovered": 2, 00:19:06.214 "num_base_bdevs_operational": 2, 00:19:06.214 "process": { 00:19:06.214 "type": "rebuild", 00:19:06.214 "target": "spare", 00:19:06.214 "progress": { 00:19:06.214 "blocks": 22528, 00:19:06.214 "percent": 35 00:19:06.214 } 00:19:06.214 }, 00:19:06.214 "base_bdevs_list": [ 00:19:06.214 { 00:19:06.214 "name": "spare", 00:19:06.214 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:06.214 "is_configured": true, 00:19:06.214 "data_offset": 2048, 00:19:06.214 "data_size": 63488 00:19:06.214 }, 00:19:06.214 { 00:19:06.214 "name": "BaseBdev2", 00:19:06.214 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:06.214 "is_configured": true, 00:19:06.214 "data_offset": 2048, 00:19:06.214 "data_size": 63488 00:19:06.214 } 00:19:06.214 ] 00:19:06.214 }' 00:19:06.215 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:06.215 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:06.215 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:06.215 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:06.215 13:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:06.473 [2024-07-15 13:41:53.974665] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:06.473 [2024-07-15 13:41:54.055363] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:06.473 [2024-07-15 13:41:54.055397] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:06.473 [2024-07-15 13:41:54.055408] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:06.473 [2024-07-15 13:41:54.055414] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:06.473 13:41:54 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.473 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.732 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.732 "name": "raid_bdev1", 00:19:06.732 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:06.732 "strip_size_kb": 0, 00:19:06.732 "state": "online", 00:19:06.732 "raid_level": "raid1", 00:19:06.732 "superblock": true, 00:19:06.732 "num_base_bdevs": 2, 00:19:06.732 "num_base_bdevs_discovered": 1, 00:19:06.732 "num_base_bdevs_operational": 1, 00:19:06.732 "base_bdevs_list": [ 00:19:06.732 { 00:19:06.732 "name": null, 00:19:06.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.732 "is_configured": false, 00:19:06.732 "data_offset": 2048, 00:19:06.732 "data_size": 63488 00:19:06.732 }, 00:19:06.732 { 00:19:06.732 "name": "BaseBdev2", 00:19:06.732 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:06.732 "is_configured": true, 00:19:06.732 "data_offset": 2048, 00:19:06.732 "data_size": 63488 00:19:06.732 } 00:19:06.732 ] 00:19:06.732 }' 00:19:06.732 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.732 13:41:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.299 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:07.557 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:07.557 "name": "raid_bdev1", 00:19:07.557 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:07.557 "strip_size_kb": 0, 00:19:07.557 "state": "online", 00:19:07.557 "raid_level": "raid1", 00:19:07.557 "superblock": true, 00:19:07.557 "num_base_bdevs": 2, 00:19:07.557 "num_base_bdevs_discovered": 1, 00:19:07.557 "num_base_bdevs_operational": 1, 00:19:07.557 "base_bdevs_list": [ 00:19:07.557 { 00:19:07.557 "name": null, 00:19:07.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.557 "is_configured": false, 00:19:07.557 "data_offset": 2048, 00:19:07.557 "data_size": 63488 00:19:07.557 }, 00:19:07.557 { 00:19:07.557 "name": "BaseBdev2", 00:19:07.557 "uuid": 
"6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:07.557 "is_configured": true, 00:19:07.557 "data_offset": 2048, 00:19:07.557 "data_size": 63488 00:19:07.557 } 00:19:07.557 ] 00:19:07.557 }' 00:19:07.557 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:07.557 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:07.558 13:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:07.558 13:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:07.558 13:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:07.558 [2024-07-15 13:41:55.167115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:07.558 [2024-07-15 13:41:55.171583] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ba5a0 00:19:07.558 [2024-07-15 13:41:55.172671] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:07.816 13:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.751 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.009 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:09.009 "name": "raid_bdev1", 00:19:09.009 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:09.009 "strip_size_kb": 0, 00:19:09.009 "state": "online", 00:19:09.009 "raid_level": "raid1", 00:19:09.009 "superblock": true, 00:19:09.009 "num_base_bdevs": 2, 00:19:09.009 "num_base_bdevs_discovered": 2, 00:19:09.009 "num_base_bdevs_operational": 2, 00:19:09.009 "process": { 00:19:09.009 "type": "rebuild", 00:19:09.009 "target": "spare", 00:19:09.009 "progress": { 00:19:09.009 "blocks": 22528, 00:19:09.009 "percent": 35 00:19:09.009 } 00:19:09.009 }, 00:19:09.009 "base_bdevs_list": [ 00:19:09.009 { 00:19:09.009 "name": "spare", 00:19:09.009 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:09.009 "is_configured": true, 00:19:09.009 "data_offset": 2048, 00:19:09.009 "data_size": 63488 00:19:09.009 }, 00:19:09.009 { 00:19:09.009 "name": "BaseBdev2", 00:19:09.009 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:09.009 "is_configured": true, 00:19:09.009 "data_offset": 2048, 00:19:09.009 "data_size": 63488 00:19:09.009 } 00:19:09.010 ] 00:19:09.010 }' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:19:09.010 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=616 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:09.010 "name": "raid_bdev1", 00:19:09.010 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:09.010 "strip_size_kb": 0, 00:19:09.010 "state": "online", 00:19:09.010 "raid_level": "raid1", 00:19:09.010 "superblock": true, 00:19:09.010 "num_base_bdevs": 2, 00:19:09.010 "num_base_bdevs_discovered": 2, 00:19:09.010 "num_base_bdevs_operational": 2, 00:19:09.010 "process": { 00:19:09.010 "type": "rebuild", 00:19:09.010 "target": "spare", 00:19:09.010 "progress": { 00:19:09.010 "blocks": 28672, 00:19:09.010 "percent": 45 00:19:09.010 } 00:19:09.010 }, 00:19:09.010 "base_bdevs_list": [ 00:19:09.010 { 00:19:09.010 "name": "spare", 00:19:09.010 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:09.010 "is_configured": true, 00:19:09.010 "data_offset": 2048, 00:19:09.010 "data_size": 63488 00:19:09.010 }, 00:19:09.010 { 00:19:09.010 "name": "BaseBdev2", 00:19:09.010 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:09.010 "is_configured": true, 00:19:09.010 "data_offset": 2048, 00:19:09.010 "data_size": 63488 00:19:09.010 } 00:19:09.010 ] 00:19:09.010 }' 00:19:09.010 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:09.268 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:09.268 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
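This stretch of the trace records a genuine shell error from the test script: at bdev_raid.sh line 665 the left-hand operand of a '[' test expands to nothing, so the command becomes '[' = false ']' and bash reports "[: =: unary operator expected"; execution then simply falls through to the next branch. A small hedged sketch of that failure mode and the usual quoting fixes follows; the variable name fast_copy is hypothetical, since the trace only shows that whatever was expanded at line 665 was empty.

    #!/usr/bin/env bash
    # Reproduces the "[: =: unary operator expected" message recorded above
    # and shows the standard ways to avoid it.

    fast_copy=""                        # empty, e.g. an argument that was never passed

    if [ $fast_copy = false ]; then     # expands to: [ = false ]  -> unary operator expected
        echo "never reached"
    fi

    # Fix 1: quote the expansion so the empty string stays a single word.
    if [ "$fast_copy" = false ]; then
        echo "condition is false, but no error is printed"
    fi

    # Fix 2: use [[ ]], which does not word-split its operands.
    if [[ $fast_copy = false ]]; then
        echo "same result, also without the error"
    fi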
00:19:09.268 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:09.268 13:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.202 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.459 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:10.459 "name": "raid_bdev1", 00:19:10.459 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:10.459 "strip_size_kb": 0, 00:19:10.459 "state": "online", 00:19:10.459 "raid_level": "raid1", 00:19:10.459 "superblock": true, 00:19:10.459 "num_base_bdevs": 2, 00:19:10.459 "num_base_bdevs_discovered": 2, 00:19:10.459 "num_base_bdevs_operational": 2, 00:19:10.459 "process": { 00:19:10.459 "type": "rebuild", 00:19:10.459 "target": "spare", 00:19:10.459 "progress": { 00:19:10.459 "blocks": 53248, 00:19:10.459 "percent": 83 00:19:10.459 } 00:19:10.459 }, 00:19:10.459 "base_bdevs_list": [ 00:19:10.459 { 00:19:10.459 "name": "spare", 00:19:10.459 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:10.459 "is_configured": true, 00:19:10.459 "data_offset": 2048, 00:19:10.459 "data_size": 63488 00:19:10.459 }, 00:19:10.459 { 00:19:10.459 "name": "BaseBdev2", 00:19:10.459 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:10.459 "is_configured": true, 00:19:10.459 "data_offset": 2048, 00:19:10.459 "data_size": 63488 00:19:10.459 } 00:19:10.459 ] 00:19:10.459 }' 00:19:10.459 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:10.459 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:10.459 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:10.459 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:10.459 13:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:10.717 [2024-07-15 13:41:58.295389] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:10.717 [2024-07-15 13:41:58.295438] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:10.717 [2024-07-15 13:41:58.295507] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:11.648 
13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.648 13:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.648 "name": "raid_bdev1", 00:19:11.648 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:11.648 "strip_size_kb": 0, 00:19:11.648 "state": "online", 00:19:11.648 "raid_level": "raid1", 00:19:11.648 "superblock": true, 00:19:11.648 "num_base_bdevs": 2, 00:19:11.648 "num_base_bdevs_discovered": 2, 00:19:11.648 "num_base_bdevs_operational": 2, 00:19:11.648 "base_bdevs_list": [ 00:19:11.648 { 00:19:11.648 "name": "spare", 00:19:11.648 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:11.648 "is_configured": true, 00:19:11.648 "data_offset": 2048, 00:19:11.648 "data_size": 63488 00:19:11.648 }, 00:19:11.648 { 00:19:11.648 "name": "BaseBdev2", 00:19:11.648 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:11.648 "is_configured": true, 00:19:11.648 "data_offset": 2048, 00:19:11.648 "data_size": 63488 00:19:11.648 } 00:19:11.648 ] 00:19:11.648 }' 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.648 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.906 "name": "raid_bdev1", 00:19:11.906 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:11.906 "strip_size_kb": 0, 00:19:11.906 "state": "online", 00:19:11.906 "raid_level": "raid1", 00:19:11.906 "superblock": true, 00:19:11.906 "num_base_bdevs": 2, 00:19:11.906 "num_base_bdevs_discovered": 2, 
00:19:11.906 "num_base_bdevs_operational": 2, 00:19:11.906 "base_bdevs_list": [ 00:19:11.906 { 00:19:11.906 "name": "spare", 00:19:11.906 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:11.906 "is_configured": true, 00:19:11.906 "data_offset": 2048, 00:19:11.906 "data_size": 63488 00:19:11.906 }, 00:19:11.906 { 00:19:11.906 "name": "BaseBdev2", 00:19:11.906 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:11.906 "is_configured": true, 00:19:11.906 "data_offset": 2048, 00:19:11.906 "data_size": 63488 00:19:11.906 } 00:19:11.906 ] 00:19:11.906 }' 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.906 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.163 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.163 "name": "raid_bdev1", 00:19:12.163 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:12.163 "strip_size_kb": 0, 00:19:12.163 "state": "online", 00:19:12.163 "raid_level": "raid1", 00:19:12.163 "superblock": true, 00:19:12.163 "num_base_bdevs": 2, 00:19:12.163 "num_base_bdevs_discovered": 2, 00:19:12.163 "num_base_bdevs_operational": 2, 00:19:12.163 "base_bdevs_list": [ 00:19:12.163 { 00:19:12.163 "name": "spare", 00:19:12.163 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:12.163 "is_configured": true, 00:19:12.163 "data_offset": 2048, 00:19:12.163 "data_size": 63488 00:19:12.163 }, 00:19:12.163 { 00:19:12.163 "name": "BaseBdev2", 00:19:12.163 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:12.163 "is_configured": true, 00:19:12.163 "data_offset": 2048, 00:19:12.163 "data_size": 63488 00:19:12.163 } 00:19:12.163 ] 00:19:12.163 }' 00:19:12.163 13:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.163 13:41:59 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@10 -- # set +x 00:19:12.728 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:12.728 [2024-07-15 13:42:00.257235] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:12.728 [2024-07-15 13:42:00.257262] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:12.728 [2024-07-15 13:42:00.257315] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:12.728 [2024-07-15 13:42:00.257356] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:12.728 [2024-07-15 13:42:00.257365] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ba930 name raid_bdev1, state offline 00:19:12.728 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.728 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:12.985 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:13.243 /dev/nbd0 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@871 -- # break 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:13.243 1+0 records in 00:19:13.243 1+0 records out 00:19:13.243 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000113831 s, 36.0 MB/s 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:13.243 /dev/nbd1 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:13.243 1+0 records in 00:19:13.243 1+0 records out 00:19:13.243 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284547 s, 14.4 MB/s 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:19:13.243 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # 
'[' 4096 '!=' 0 ']' 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:13.501 13:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:13.501 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:19:13.759 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:14.017 13:42:01 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:14.017 [2024-07-15 13:42:01.597180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:14.017 [2024-07-15 13:42:01.597223] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:14.017 [2024-07-15 13:42:01.597240] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b9dc0 00:19:14.017 [2024-07-15 13:42:01.597249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:14.017 [2024-07-15 13:42:01.598497] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:14.017 [2024-07-15 13:42:01.598527] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:14.017 [2024-07-15 13:42:01.598596] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:14.017 [2024-07-15 13:42:01.598618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:14.017 [2024-07-15 13:42:01.598695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:14.017 spare 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:14.017 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.276 [2024-07-15 13:42:01.698993] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12bad50 00:19:14.276 [2024-07-15 13:42:01.699013] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:14.276 [2024-07-15 13:42:01.699177] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b3d50 00:19:14.276 [2024-07-15 13:42:01.699301] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12bad50 00:19:14.276 [2024-07-15 13:42:01.699308] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12bad50 00:19:14.276 [2024-07-15 13:42:01.699396] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:14.276 13:42:01 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.276 "name": "raid_bdev1", 00:19:14.276 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:14.276 "strip_size_kb": 0, 00:19:14.276 "state": "online", 00:19:14.276 "raid_level": "raid1", 00:19:14.276 "superblock": true, 00:19:14.276 "num_base_bdevs": 2, 00:19:14.276 "num_base_bdevs_discovered": 2, 00:19:14.276 "num_base_bdevs_operational": 2, 00:19:14.276 "base_bdevs_list": [ 00:19:14.276 { 00:19:14.276 "name": "spare", 00:19:14.276 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:14.276 "is_configured": true, 00:19:14.276 "data_offset": 2048, 00:19:14.276 "data_size": 63488 00:19:14.276 }, 00:19:14.276 { 00:19:14.276 "name": "BaseBdev2", 00:19:14.276 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:14.276 "is_configured": true, 00:19:14.276 "data_offset": 2048, 00:19:14.276 "data_size": 63488 00:19:14.276 } 00:19:14.276 ] 00:19:14.276 }' 00:19:14.276 13:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.276 13:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:14.842 "name": "raid_bdev1", 00:19:14.842 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:14.842 "strip_size_kb": 0, 00:19:14.842 "state": "online", 00:19:14.842 "raid_level": "raid1", 00:19:14.842 "superblock": true, 00:19:14.842 "num_base_bdevs": 2, 00:19:14.842 "num_base_bdevs_discovered": 2, 00:19:14.842 "num_base_bdevs_operational": 2, 00:19:14.842 "base_bdevs_list": [ 00:19:14.842 { 00:19:14.842 "name": "spare", 00:19:14.842 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:14.842 "is_configured": true, 00:19:14.842 "data_offset": 2048, 00:19:14.842 "data_size": 63488 00:19:14.842 }, 00:19:14.842 { 00:19:14.842 "name": "BaseBdev2", 00:19:14.842 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:14.842 "is_configured": true, 00:19:14.842 "data_offset": 2048, 00:19:14.842 "data_size": 63488 00:19:14.842 } 00:19:14.842 ] 00:19:14.842 }' 00:19:14.842 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:19:15.101 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:15.358 [2024-07-15 13:42:02.844463] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.358 13:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.648 13:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.648 "name": "raid_bdev1", 00:19:15.648 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:15.648 "strip_size_kb": 0, 00:19:15.648 "state": "online", 00:19:15.648 "raid_level": "raid1", 00:19:15.648 "superblock": true, 00:19:15.648 "num_base_bdevs": 2, 00:19:15.648 "num_base_bdevs_discovered": 1, 00:19:15.648 "num_base_bdevs_operational": 1, 00:19:15.648 "base_bdevs_list": [ 00:19:15.648 { 00:19:15.648 "name": null, 00:19:15.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.648 "is_configured": false, 00:19:15.648 "data_offset": 2048, 00:19:15.648 "data_size": 63488 00:19:15.648 }, 00:19:15.648 { 00:19:15.648 "name": "BaseBdev2", 00:19:15.648 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:15.648 "is_configured": true, 00:19:15.648 "data_offset": 2048, 00:19:15.648 "data_size": 63488 00:19:15.648 } 00:19:15.648 ] 00:19:15.648 }' 00:19:15.648 13:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.648 13:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.224 13:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:16.224 [2024-07-15 13:42:03.710717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:16.224 [2024-07-15 
13:42:03.710846] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:16.224 [2024-07-15 13:42:03.710858] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:16.224 [2024-07-15 13:42:03.710881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:16.224 [2024-07-15 13:42:03.715274] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xde4fc0 00:19:16.224 [2024-07-15 13:42:03.717079] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:16.224 13:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:17.157 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.414 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:17.414 "name": "raid_bdev1", 00:19:17.414 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:17.414 "strip_size_kb": 0, 00:19:17.414 "state": "online", 00:19:17.414 "raid_level": "raid1", 00:19:17.414 "superblock": true, 00:19:17.414 "num_base_bdevs": 2, 00:19:17.414 "num_base_bdevs_discovered": 2, 00:19:17.414 "num_base_bdevs_operational": 2, 00:19:17.414 "process": { 00:19:17.414 "type": "rebuild", 00:19:17.414 "target": "spare", 00:19:17.414 "progress": { 00:19:17.414 "blocks": 22528, 00:19:17.414 "percent": 35 00:19:17.414 } 00:19:17.414 }, 00:19:17.414 "base_bdevs_list": [ 00:19:17.414 { 00:19:17.414 "name": "spare", 00:19:17.414 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:17.414 "is_configured": true, 00:19:17.414 "data_offset": 2048, 00:19:17.414 "data_size": 63488 00:19:17.414 }, 00:19:17.414 { 00:19:17.414 "name": "BaseBdev2", 00:19:17.414 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:17.414 "is_configured": true, 00:19:17.414 "data_offset": 2048, 00:19:17.414 "data_size": 63488 00:19:17.414 } 00:19:17.414 ] 00:19:17.414 }' 00:19:17.414 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:17.414 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:17.414 13:42:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:17.414 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:17.414 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:17.671 [2024-07-15 13:42:05.176031] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:17.671 [2024-07-15 13:42:05.228125] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:17.671 [2024-07-15 13:42:05.228163] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:17.671 [2024-07-15 13:42:05.228189] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:17.671 [2024-07-15 13:42:05.228195] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:17.671 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:17.671 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:17.671 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:17.671 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.672 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:17.929 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.929 "name": "raid_bdev1", 00:19:17.929 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:17.929 "strip_size_kb": 0, 00:19:17.929 "state": "online", 00:19:17.929 "raid_level": "raid1", 00:19:17.929 "superblock": true, 00:19:17.929 "num_base_bdevs": 2, 00:19:17.929 "num_base_bdevs_discovered": 1, 00:19:17.929 "num_base_bdevs_operational": 1, 00:19:17.929 "base_bdevs_list": [ 00:19:17.929 { 00:19:17.929 "name": null, 00:19:17.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.929 "is_configured": false, 00:19:17.929 "data_offset": 2048, 00:19:17.929 "data_size": 63488 00:19:17.929 }, 00:19:17.929 { 00:19:17.929 "name": "BaseBdev2", 00:19:17.929 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:17.929 "is_configured": true, 00:19:17.929 "data_offset": 2048, 00:19:17.929 "data_size": 63488 00:19:17.929 } 00:19:17.929 ] 00:19:17.929 }' 00:19:17.929 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.929 13:42:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.494 13:42:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:18.494 [2024-07-15 13:42:06.058976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:18.494 [2024-07-15 
13:42:06.059027] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:18.494 [2024-07-15 13:42:06.059045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1101350 00:19:18.494 [2024-07-15 13:42:06.059054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:18.494 [2024-07-15 13:42:06.059348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:18.494 [2024-07-15 13:42:06.059363] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:18.494 [2024-07-15 13:42:06.059430] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:18.494 [2024-07-15 13:42:06.059440] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:18.494 [2024-07-15 13:42:06.059448] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:18.494 [2024-07-15 13:42:06.059462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:18.494 [2024-07-15 13:42:06.063880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b9ac0 00:19:18.494 spare 00:19:18.494 [2024-07-15 13:42:06.064963] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:18.494 13:42:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:19.867 "name": "raid_bdev1", 00:19:19.867 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:19.867 "strip_size_kb": 0, 00:19:19.867 "state": "online", 00:19:19.867 "raid_level": "raid1", 00:19:19.867 "superblock": true, 00:19:19.867 "num_base_bdevs": 2, 00:19:19.867 "num_base_bdevs_discovered": 2, 00:19:19.867 "num_base_bdevs_operational": 2, 00:19:19.867 "process": { 00:19:19.867 "type": "rebuild", 00:19:19.867 "target": "spare", 00:19:19.867 "progress": { 00:19:19.867 "blocks": 22528, 00:19:19.867 "percent": 35 00:19:19.867 } 00:19:19.867 }, 00:19:19.867 "base_bdevs_list": [ 00:19:19.867 { 00:19:19.867 "name": "spare", 00:19:19.867 "uuid": "2c15067a-5834-5362-97f8-b93ddca852d1", 00:19:19.867 "is_configured": true, 00:19:19.867 "data_offset": 2048, 00:19:19.867 "data_size": 63488 00:19:19.867 }, 00:19:19.867 { 00:19:19.867 "name": "BaseBdev2", 00:19:19.867 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:19.867 "is_configured": true, 00:19:19.867 "data_offset": 2048, 00:19:19.867 "data_size": 63488 00:19:19.867 } 
00:19:19.867 ] 00:19:19.867 }' 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:19.867 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:20.125 [2024-07-15 13:42:07.507738] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:20.125 [2024-07-15 13:42:07.576373] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:20.125 [2024-07-15 13:42:07.576405] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:20.125 [2024-07-15 13:42:07.576415] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:20.125 [2024-07-15 13:42:07.576420] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.125 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:20.383 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.383 "name": "raid_bdev1", 00:19:20.383 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:20.383 "strip_size_kb": 0, 00:19:20.383 "state": "online", 00:19:20.383 "raid_level": "raid1", 00:19:20.383 "superblock": true, 00:19:20.383 "num_base_bdevs": 2, 00:19:20.383 "num_base_bdevs_discovered": 1, 00:19:20.383 "num_base_bdevs_operational": 1, 00:19:20.383 "base_bdevs_list": [ 00:19:20.383 { 00:19:20.383 "name": null, 00:19:20.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.383 "is_configured": false, 00:19:20.383 "data_offset": 2048, 00:19:20.383 "data_size": 63488 00:19:20.383 }, 00:19:20.383 { 00:19:20.383 "name": "BaseBdev2", 00:19:20.383 "uuid": 
"6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:20.383 "is_configured": true, 00:19:20.383 "data_offset": 2048, 00:19:20.383 "data_size": 63488 00:19:20.383 } 00:19:20.384 ] 00:19:20.384 }' 00:19:20.384 13:42:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.384 13:42:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:20.949 "name": "raid_bdev1", 00:19:20.949 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:20.949 "strip_size_kb": 0, 00:19:20.949 "state": "online", 00:19:20.949 "raid_level": "raid1", 00:19:20.949 "superblock": true, 00:19:20.949 "num_base_bdevs": 2, 00:19:20.949 "num_base_bdevs_discovered": 1, 00:19:20.949 "num_base_bdevs_operational": 1, 00:19:20.949 "base_bdevs_list": [ 00:19:20.949 { 00:19:20.949 "name": null, 00:19:20.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.949 "is_configured": false, 00:19:20.949 "data_offset": 2048, 00:19:20.949 "data_size": 63488 00:19:20.949 }, 00:19:20.949 { 00:19:20.949 "name": "BaseBdev2", 00:19:20.949 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:20.949 "is_configured": true, 00:19:20.949 "data_offset": 2048, 00:19:20.949 "data_size": 63488 00:19:20.949 } 00:19:20.949 ] 00:19:20.949 }' 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:20.949 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:21.207 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:21.464 [2024-07-15 13:42:08.876285] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:21.464 [2024-07-15 13:42:08.876327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:21.464 [2024-07-15 13:42:08.876343] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b9ff0 00:19:21.464 [2024-07-15 13:42:08.876351] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:21.464 [2024-07-15 13:42:08.876629] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:21.464 [2024-07-15 13:42:08.876642] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:21.464 [2024-07-15 13:42:08.876693] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:19:21.464 [2024-07-15 13:42:08.876703] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:21.464 [2024-07-15 13:42:08.876710] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:21.464 BaseBdev1 00:19:21.464 13:42:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.399 13:42:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:22.656 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.656 "name": "raid_bdev1", 00:19:22.656 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:22.656 "strip_size_kb": 0, 00:19:22.656 "state": "online", 00:19:22.656 "raid_level": "raid1", 00:19:22.656 "superblock": true, 00:19:22.656 "num_base_bdevs": 2, 00:19:22.656 "num_base_bdevs_discovered": 1, 00:19:22.656 "num_base_bdevs_operational": 1, 00:19:22.656 "base_bdevs_list": [ 00:19:22.656 { 00:19:22.656 "name": null, 00:19:22.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.656 "is_configured": false, 00:19:22.656 "data_offset": 2048, 00:19:22.656 "data_size": 63488 00:19:22.656 }, 00:19:22.656 { 00:19:22.656 "name": "BaseBdev2", 00:19:22.656 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:22.656 "is_configured": true, 00:19:22.656 "data_offset": 2048, 00:19:22.656 "data_size": 63488 00:19:22.656 } 00:19:22.656 ] 00:19:22.656 }' 00:19:22.656 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.656 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:23.221 "name": "raid_bdev1", 00:19:23.221 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:23.221 "strip_size_kb": 0, 00:19:23.221 "state": "online", 00:19:23.221 "raid_level": "raid1", 00:19:23.221 "superblock": true, 00:19:23.221 "num_base_bdevs": 2, 00:19:23.221 "num_base_bdevs_discovered": 1, 00:19:23.221 "num_base_bdevs_operational": 1, 00:19:23.221 "base_bdevs_list": [ 00:19:23.221 { 00:19:23.221 "name": null, 00:19:23.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.221 "is_configured": false, 00:19:23.221 "data_offset": 2048, 00:19:23.221 "data_size": 63488 00:19:23.221 }, 00:19:23.221 { 00:19:23.221 "name": "BaseBdev2", 00:19:23.221 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:23.221 "is_configured": true, 00:19:23.221 "data_offset": 2048, 00:19:23.221 "data_size": 63488 00:19:23.221 } 00:19:23.221 ] 00:19:23.221 }' 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:23.221 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:23.478 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:23.478 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:23.478 13:42:10 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:23.478 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:23.478 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:23.478 13:42:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:23.478 [2024-07-15 13:42:11.001777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:23.478 [2024-07-15 13:42:11.001886] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:23.478 [2024-07-15 13:42:11.001897] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:23.478 request: 00:19:23.478 { 00:19:23.478 "base_bdev": "BaseBdev1", 00:19:23.478 "raid_bdev": "raid_bdev1", 00:19:23.478 "method": "bdev_raid_add_base_bdev", 00:19:23.478 "req_id": 1 00:19:23.478 } 00:19:23.478 Got JSON-RPC error response 00:19:23.478 response: 00:19:23.478 { 00:19:23.478 "code": -22, 00:19:23.478 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:19:23.478 } 00:19:23.478 13:42:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:19:23.478 13:42:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:23.478 13:42:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:23.478 13:42:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:23.478 13:42:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.845 "name": "raid_bdev1", 00:19:24.845 
"uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:24.845 "strip_size_kb": 0, 00:19:24.845 "state": "online", 00:19:24.845 "raid_level": "raid1", 00:19:24.845 "superblock": true, 00:19:24.845 "num_base_bdevs": 2, 00:19:24.845 "num_base_bdevs_discovered": 1, 00:19:24.845 "num_base_bdevs_operational": 1, 00:19:24.845 "base_bdevs_list": [ 00:19:24.845 { 00:19:24.845 "name": null, 00:19:24.845 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.845 "is_configured": false, 00:19:24.845 "data_offset": 2048, 00:19:24.845 "data_size": 63488 00:19:24.845 }, 00:19:24.845 { 00:19:24.845 "name": "BaseBdev2", 00:19:24.845 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:24.845 "is_configured": true, 00:19:24.845 "data_offset": 2048, 00:19:24.845 "data_size": 63488 00:19:24.845 } 00:19:24.845 ] 00:19:24.845 }' 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.845 13:42:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:25.101 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:25.101 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:25.101 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:25.102 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:25.102 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:25.102 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:25.102 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:25.358 "name": "raid_bdev1", 00:19:25.358 "uuid": "f9f8929d-05b2-4d4b-8308-eedd1246e83a", 00:19:25.358 "strip_size_kb": 0, 00:19:25.358 "state": "online", 00:19:25.358 "raid_level": "raid1", 00:19:25.358 "superblock": true, 00:19:25.358 "num_base_bdevs": 2, 00:19:25.358 "num_base_bdevs_discovered": 1, 00:19:25.358 "num_base_bdevs_operational": 1, 00:19:25.358 "base_bdevs_list": [ 00:19:25.358 { 00:19:25.358 "name": null, 00:19:25.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.358 "is_configured": false, 00:19:25.358 "data_offset": 2048, 00:19:25.358 "data_size": 63488 00:19:25.358 }, 00:19:25.358 { 00:19:25.358 "name": "BaseBdev2", 00:19:25.358 "uuid": "6d78eaf8-d8c9-5784-906f-8ba9633e74dc", 00:19:25.358 "is_configured": true, 00:19:25.358 "data_offset": 2048, 00:19:25.358 "data_size": 63488 00:19:25.358 } 00:19:25.358 ] 00:19:25.358 }' 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 60520 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 60520 ']' 00:19:25.358 13:42:12 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 60520 00:19:25.358 13:42:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:25.615 13:42:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:25.615 13:42:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 60520 00:19:25.615 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:25.615 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:25.615 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 60520' 00:19:25.615 killing process with pid 60520 00:19:25.615 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 60520 00:19:25.615 Received shutdown signal, test time was about 60.000000 seconds 00:19:25.615 00:19:25.615 Latency(us) 00:19:25.615 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:25.615 =================================================================================================================== 00:19:25.615 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:19:25.615 [2024-07-15 13:42:13.014313] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:25.615 [2024-07-15 13:42:13.014388] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:25.615 [2024-07-15 13:42:13.014420] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:25.615 [2024-07-15 13:42:13.014428] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12bad50 name raid_bdev1, state offline 00:19:25.615 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 60520 00:19:25.615 [2024-07-15 13:42:13.040228] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:19:25.871 00:19:25.871 real 0m29.216s 00:19:25.871 user 0m41.460s 00:19:25.871 sys 0m5.359s 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:25.871 ************************************ 00:19:25.871 END TEST raid_rebuild_test_sb 00:19:25.871 ************************************ 00:19:25.871 13:42:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:25.871 13:42:13 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:19:25.871 13:42:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:25.871 13:42:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:25.871 13:42:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:25.871 ************************************ 00:19:25.871 START TEST raid_rebuild_test_io 00:19:25.871 ************************************ 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:25.871 
13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=65243 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 65243 /var/tmp/spdk-raid.sock 00:19:25.871 13:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:25.872 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 65243 ']' 00:19:25.872 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:25.872 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:25.872 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:25.872 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:19:25.872 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:25.872 13:42:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:25.872 [2024-07-15 13:42:13.363752] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:19:25.872 [2024-07-15 13:42:13.363801] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid65243 ] 00:19:25.872 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:25.872 Zero copy mechanism will not be used. 00:19:25.872 [2024-07-15 13:42:13.451541] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:26.127 [2024-07-15 13:42:13.545015] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:26.127 [2024-07-15 13:42:13.602332] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:26.127 [2024-07-15 13:42:13.602357] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:26.689 13:42:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:26.689 13:42:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:19:26.689 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:26.689 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:26.946 BaseBdev1_malloc 00:19:26.946 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:26.946 [2024-07-15 13:42:14.507853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:26.946 [2024-07-15 13:42:14.507890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.946 [2024-07-15 13:42:14.507907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc31600 00:19:26.946 [2024-07-15 13:42:14.507915] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.946 [2024-07-15 13:42:14.509126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.946 [2024-07-15 13:42:14.509150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:26.946 BaseBdev1 00:19:26.946 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:26.946 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:27.202 BaseBdev2_malloc 00:19:27.202 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:27.459 [2024-07-15 13:42:14.836654] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:27.459 [2024-07-15 13:42:14.836690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:19:27.459 [2024-07-15 13:42:14.836707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc32120 00:19:27.459 [2024-07-15 13:42:14.836720] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:27.459 [2024-07-15 13:42:14.837815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:27.459 [2024-07-15 13:42:14.837838] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:27.459 BaseBdev2 00:19:27.459 13:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:27.459 spare_malloc 00:19:27.459 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:27.716 spare_delay 00:19:27.716 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:27.973 [2024-07-15 13:42:15.349669] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:27.973 [2024-07-15 13:42:15.349707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:27.973 [2024-07-15 13:42:15.349723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde0780 00:19:27.973 [2024-07-15 13:42:15.349731] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:27.973 [2024-07-15 13:42:15.350938] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:27.973 [2024-07-15 13:42:15.350962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:27.973 spare 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:27.973 [2024-07-15 13:42:15.510094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:27.973 [2024-07-15 13:42:15.511071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:27.973 [2024-07-15 13:42:15.511130] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde1930 00:19:27.973 [2024-07-15 13:42:15.511138] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:27.973 [2024-07-15 13:42:15.511289] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xddad50 00:19:27.973 [2024-07-15 13:42:15.511394] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde1930 00:19:27.973 [2024-07-15 13:42:15.511400] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde1930 00:19:27.973 [2024-07-15 13:42:15.511480] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.973 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.230 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.230 "name": "raid_bdev1", 00:19:28.230 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:28.230 "strip_size_kb": 0, 00:19:28.230 "state": "online", 00:19:28.230 "raid_level": "raid1", 00:19:28.230 "superblock": false, 00:19:28.230 "num_base_bdevs": 2, 00:19:28.230 "num_base_bdevs_discovered": 2, 00:19:28.230 "num_base_bdevs_operational": 2, 00:19:28.230 "base_bdevs_list": [ 00:19:28.230 { 00:19:28.230 "name": "BaseBdev1", 00:19:28.230 "uuid": "f1911916-df47-5249-9840-ff9b64b69496", 00:19:28.230 "is_configured": true, 00:19:28.230 "data_offset": 0, 00:19:28.230 "data_size": 65536 00:19:28.230 }, 00:19:28.230 { 00:19:28.230 "name": "BaseBdev2", 00:19:28.230 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:28.230 "is_configured": true, 00:19:28.230 "data_offset": 0, 00:19:28.230 "data_size": 65536 00:19:28.230 } 00:19:28.230 ] 00:19:28.230 }' 00:19:28.230 13:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.230 13:42:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:28.792 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:28.792 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:28.792 [2024-07-15 13:42:16.376487] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:28.792 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:19:28.792 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:28.792 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.049 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:19:29.049 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:19:29.049 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev1 00:19:29.049 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:29.049 [2024-07-15 13:42:16.654946] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xddc490 00:19:29.049 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:29.049 Zero copy mechanism will not be used. 00:19:29.049 Running I/O for 60 seconds... 00:19:29.305 [2024-07-15 13:42:16.746051] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:29.305 [2024-07-15 13:42:16.751458] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xddc490 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:29.305 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.562 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.562 "name": "raid_bdev1", 00:19:29.562 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:29.562 "strip_size_kb": 0, 00:19:29.562 "state": "online", 00:19:29.562 "raid_level": "raid1", 00:19:29.562 "superblock": false, 00:19:29.562 "num_base_bdevs": 2, 00:19:29.562 "num_base_bdevs_discovered": 1, 00:19:29.562 "num_base_bdevs_operational": 1, 00:19:29.562 "base_bdevs_list": [ 00:19:29.562 { 00:19:29.562 "name": null, 00:19:29.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.562 "is_configured": false, 00:19:29.562 "data_offset": 0, 00:19:29.562 "data_size": 65536 00:19:29.562 }, 00:19:29.562 { 00:19:29.562 "name": "BaseBdev2", 00:19:29.562 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:29.562 "is_configured": true, 00:19:29.562 "data_offset": 0, 00:19:29.562 "data_size": 65536 00:19:29.562 } 00:19:29.562 ] 00:19:29.562 }' 00:19:29.562 13:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.562 13:42:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:30.124 13:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 
00:19:30.124 [2024-07-15 13:42:17.622270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:30.124 [2024-07-15 13:42:17.653629] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd64170 00:19:30.124 [2024-07-15 13:42:17.655586] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:30.124 13:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:30.381 [2024-07-15 13:42:17.770032] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:30.381 [2024-07-15 13:42:17.770440] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:30.637 [2024-07-15 13:42:18.000792] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:30.637 [2024-07-15 13:42:18.001064] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:30.637 [2024-07-15 13:42:18.227604] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:30.894 [2024-07-15 13:42:18.335764] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:31.151 [2024-07-15 13:42:18.655472] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.151 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.407 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:31.407 "name": "raid_bdev1", 00:19:31.407 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:31.407 "strip_size_kb": 0, 00:19:31.407 "state": "online", 00:19:31.407 "raid_level": "raid1", 00:19:31.407 "superblock": false, 00:19:31.407 "num_base_bdevs": 2, 00:19:31.407 "num_base_bdevs_discovered": 2, 00:19:31.407 "num_base_bdevs_operational": 2, 00:19:31.407 "process": { 00:19:31.407 "type": "rebuild", 00:19:31.407 "target": "spare", 00:19:31.407 "progress": { 00:19:31.407 "blocks": 14336, 00:19:31.407 "percent": 21 00:19:31.407 } 00:19:31.407 }, 00:19:31.407 "base_bdevs_list": [ 00:19:31.407 { 00:19:31.407 "name": "spare", 00:19:31.407 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:31.407 "is_configured": true, 00:19:31.407 "data_offset": 0, 00:19:31.407 "data_size": 65536 00:19:31.407 }, 00:19:31.407 { 00:19:31.407 "name": "BaseBdev2", 00:19:31.407 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:31.407 "is_configured": true, 
00:19:31.407 "data_offset": 0, 00:19:31.407 "data_size": 65536 00:19:31.407 } 00:19:31.407 ] 00:19:31.407 }' 00:19:31.407 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:31.407 [2024-07-15 13:42:18.869490] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:31.407 [2024-07-15 13:42:18.869767] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:31.407 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:31.407 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:31.407 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:31.407 13:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:31.664 [2024-07-15 13:42:19.105824] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:31.920 [2024-07-15 13:42:19.322854] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:31.921 [2024-07-15 13:42:19.329754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:31.921 [2024-07-15 13:42:19.329778] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:31.921 [2024-07-15 13:42:19.329785] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:31.921 [2024-07-15 13:42:19.346149] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xddc490 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.921 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.177 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.177 "name": "raid_bdev1", 00:19:32.177 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:32.177 "strip_size_kb": 0, 00:19:32.177 "state": "online", 
00:19:32.177 "raid_level": "raid1", 00:19:32.177 "superblock": false, 00:19:32.177 "num_base_bdevs": 2, 00:19:32.177 "num_base_bdevs_discovered": 1, 00:19:32.177 "num_base_bdevs_operational": 1, 00:19:32.177 "base_bdevs_list": [ 00:19:32.177 { 00:19:32.177 "name": null, 00:19:32.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.177 "is_configured": false, 00:19:32.177 "data_offset": 0, 00:19:32.177 "data_size": 65536 00:19:32.177 }, 00:19:32.177 { 00:19:32.177 "name": "BaseBdev2", 00:19:32.177 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:32.177 "is_configured": true, 00:19:32.177 "data_offset": 0, 00:19:32.177 "data_size": 65536 00:19:32.177 } 00:19:32.177 ] 00:19:32.177 }' 00:19:32.177 13:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.177 13:42:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:32.741 "name": "raid_bdev1", 00:19:32.741 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:32.741 "strip_size_kb": 0, 00:19:32.741 "state": "online", 00:19:32.741 "raid_level": "raid1", 00:19:32.741 "superblock": false, 00:19:32.741 "num_base_bdevs": 2, 00:19:32.741 "num_base_bdevs_discovered": 1, 00:19:32.741 "num_base_bdevs_operational": 1, 00:19:32.741 "base_bdevs_list": [ 00:19:32.741 { 00:19:32.741 "name": null, 00:19:32.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.741 "is_configured": false, 00:19:32.741 "data_offset": 0, 00:19:32.741 "data_size": 65536 00:19:32.741 }, 00:19:32.741 { 00:19:32.741 "name": "BaseBdev2", 00:19:32.741 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:32.741 "is_configured": true, 00:19:32.741 "data_offset": 0, 00:19:32.741 "data_size": 65536 00:19:32.741 } 00:19:32.741 ] 00:19:32.741 }' 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:32.741 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:32.998 [2024-07-15 13:42:20.492237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:32.998 [2024-07-15 
13:42:20.517190] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xde1d10 00:19:32.998 [2024-07-15 13:42:20.518337] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:32.998 13:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:33.255 [2024-07-15 13:42:20.632812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:33.255 [2024-07-15 13:42:20.633243] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:33.255 [2024-07-15 13:42:20.836164] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:33.255 [2024-07-15 13:42:20.836382] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:33.816 [2024-07-15 13:42:21.169398] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:33.816 [2024-07-15 13:42:21.270750] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:33.816 [2024-07-15 13:42:21.271024] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:34.073 [2024-07-15 13:42:21.491975] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.073 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.073 [2024-07-15 13:42:21.617106] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:34.073 [2024-07-15 13:42:21.617348] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:34.329 "name": "raid_bdev1", 00:19:34.329 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:34.329 "strip_size_kb": 0, 00:19:34.329 "state": "online", 00:19:34.329 "raid_level": "raid1", 00:19:34.329 "superblock": false, 00:19:34.329 "num_base_bdevs": 2, 00:19:34.329 "num_base_bdevs_discovered": 2, 00:19:34.329 "num_base_bdevs_operational": 2, 00:19:34.329 "process": { 00:19:34.329 "type": "rebuild", 00:19:34.329 "target": "spare", 00:19:34.329 "progress": { 00:19:34.329 "blocks": 16384, 00:19:34.329 "percent": 25 00:19:34.329 } 00:19:34.329 }, 00:19:34.329 "base_bdevs_list": [ 00:19:34.329 { 00:19:34.329 
"name": "spare", 00:19:34.329 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:34.329 "is_configured": true, 00:19:34.329 "data_offset": 0, 00:19:34.329 "data_size": 65536 00:19:34.329 }, 00:19:34.329 { 00:19:34.329 "name": "BaseBdev2", 00:19:34.329 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:34.329 "is_configured": true, 00:19:34.329 "data_offset": 0, 00:19:34.329 "data_size": 65536 00:19:34.329 } 00:19:34.329 ] 00:19:34.329 }' 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=641 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.329 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.329 [2024-07-15 13:42:21.881795] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:19:34.586 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:34.586 "name": "raid_bdev1", 00:19:34.586 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:34.586 "strip_size_kb": 0, 00:19:34.586 "state": "online", 00:19:34.586 "raid_level": "raid1", 00:19:34.586 "superblock": false, 00:19:34.586 "num_base_bdevs": 2, 00:19:34.586 "num_base_bdevs_discovered": 2, 00:19:34.586 "num_base_bdevs_operational": 2, 00:19:34.586 "process": { 00:19:34.586 "type": "rebuild", 00:19:34.586 "target": "spare", 00:19:34.586 "progress": { 00:19:34.586 "blocks": 20480, 00:19:34.586 "percent": 31 00:19:34.586 } 00:19:34.586 }, 00:19:34.586 "base_bdevs_list": [ 00:19:34.586 { 00:19:34.586 "name": "spare", 00:19:34.586 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:34.586 "is_configured": true, 00:19:34.586 "data_offset": 0, 00:19:34.586 "data_size": 65536 00:19:34.586 }, 00:19:34.586 { 00:19:34.586 "name": 
"BaseBdev2", 00:19:34.586 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:34.586 "is_configured": true, 00:19:34.586 "data_offset": 0, 00:19:34.586 "data_size": 65536 00:19:34.586 } 00:19:34.586 ] 00:19:34.586 }' 00:19:34.586 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:34.586 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:34.586 13:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:34.586 13:42:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:34.586 13:42:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:34.842 [2024-07-15 13:42:22.322910] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:19:34.842 [2024-07-15 13:42:22.323327] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:19:35.406 [2024-07-15 13:42:22.887893] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:19:35.663 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:35.663 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:35.663 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:35.663 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:35.663 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:35.663 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:35.664 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.664 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.664 [2024-07-15 13:42:23.224440] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:19:35.664 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:35.664 "name": "raid_bdev1", 00:19:35.664 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:35.664 "strip_size_kb": 0, 00:19:35.664 "state": "online", 00:19:35.664 "raid_level": "raid1", 00:19:35.664 "superblock": false, 00:19:35.664 "num_base_bdevs": 2, 00:19:35.664 "num_base_bdevs_discovered": 2, 00:19:35.664 "num_base_bdevs_operational": 2, 00:19:35.664 "process": { 00:19:35.664 "type": "rebuild", 00:19:35.664 "target": "spare", 00:19:35.664 "progress": { 00:19:35.664 "blocks": 36864, 00:19:35.664 "percent": 56 00:19:35.664 } 00:19:35.664 }, 00:19:35.664 "base_bdevs_list": [ 00:19:35.664 { 00:19:35.664 "name": "spare", 00:19:35.664 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:35.664 "is_configured": true, 00:19:35.664 "data_offset": 0, 00:19:35.664 "data_size": 65536 00:19:35.664 }, 00:19:35.664 { 00:19:35.664 "name": "BaseBdev2", 00:19:35.664 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:35.664 "is_configured": true, 00:19:35.664 "data_offset": 0, 00:19:35.664 "data_size": 65536 
00:19:35.664 } 00:19:35.664 ] 00:19:35.664 }' 00:19:35.664 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:35.664 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:35.664 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:35.919 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:35.919 13:42:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:36.175 [2024-07-15 13:42:23.669456] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.739 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.036 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:37.036 "name": "raid_bdev1", 00:19:37.036 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:37.036 "strip_size_kb": 0, 00:19:37.036 "state": "online", 00:19:37.036 "raid_level": "raid1", 00:19:37.036 "superblock": false, 00:19:37.036 "num_base_bdevs": 2, 00:19:37.036 "num_base_bdevs_discovered": 2, 00:19:37.036 "num_base_bdevs_operational": 2, 00:19:37.036 "process": { 00:19:37.036 "type": "rebuild", 00:19:37.036 "target": "spare", 00:19:37.036 "progress": { 00:19:37.036 "blocks": 59392, 00:19:37.036 "percent": 90 00:19:37.036 } 00:19:37.036 }, 00:19:37.036 "base_bdevs_list": [ 00:19:37.036 { 00:19:37.036 "name": "spare", 00:19:37.036 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:37.036 "is_configured": true, 00:19:37.036 "data_offset": 0, 00:19:37.036 "data_size": 65536 00:19:37.036 }, 00:19:37.036 { 00:19:37.036 "name": "BaseBdev2", 00:19:37.036 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:37.036 "is_configured": true, 00:19:37.036 "data_offset": 0, 00:19:37.036 "data_size": 65536 00:19:37.036 } 00:19:37.036 ] 00:19:37.036 }' 00:19:37.036 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:37.036 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:37.036 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:37.036 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:37.036 13:42:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:37.344 [2024-07-15 13:42:24.746279] 
bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:37.344 [2024-07-15 13:42:24.851490] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:37.344 [2024-07-15 13:42:24.852699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:38.277 "name": "raid_bdev1", 00:19:38.277 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:38.277 "strip_size_kb": 0, 00:19:38.277 "state": "online", 00:19:38.277 "raid_level": "raid1", 00:19:38.277 "superblock": false, 00:19:38.277 "num_base_bdevs": 2, 00:19:38.277 "num_base_bdevs_discovered": 2, 00:19:38.277 "num_base_bdevs_operational": 2, 00:19:38.277 "base_bdevs_list": [ 00:19:38.277 { 00:19:38.277 "name": "spare", 00:19:38.277 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:38.277 "is_configured": true, 00:19:38.277 "data_offset": 0, 00:19:38.277 "data_size": 65536 00:19:38.277 }, 00:19:38.277 { 00:19:38.277 "name": "BaseBdev2", 00:19:38.277 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:38.277 "is_configured": true, 00:19:38.277 "data_offset": 0, 00:19:38.277 "data_size": 65536 00:19:38.277 } 00:19:38.277 ] 00:19:38.277 }' 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:19:38.277 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:38.278 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:38.278 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:38.278 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:38.278 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:38.278 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.278 13:42:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:38.536 "name": "raid_bdev1", 00:19:38.536 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:38.536 "strip_size_kb": 0, 00:19:38.536 "state": "online", 00:19:38.536 "raid_level": "raid1", 00:19:38.536 "superblock": false, 00:19:38.536 "num_base_bdevs": 2, 00:19:38.536 "num_base_bdevs_discovered": 2, 00:19:38.536 "num_base_bdevs_operational": 2, 00:19:38.536 "base_bdevs_list": [ 00:19:38.536 { 00:19:38.536 "name": "spare", 00:19:38.536 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:38.536 "is_configured": true, 00:19:38.536 "data_offset": 0, 00:19:38.536 "data_size": 65536 00:19:38.536 }, 00:19:38.536 { 00:19:38.536 "name": "BaseBdev2", 00:19:38.536 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:38.536 "is_configured": true, 00:19:38.536 "data_offset": 0, 00:19:38.536 "data_size": 65536 00:19:38.536 } 00:19:38.536 ] 00:19:38.536 }' 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.536 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.537 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.795 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.795 "name": "raid_bdev1", 00:19:38.795 "uuid": "d9d5bafb-c3f3-4bea-933d-733fa333caa1", 00:19:38.795 "strip_size_kb": 0, 00:19:38.795 "state": "online", 00:19:38.795 "raid_level": "raid1", 00:19:38.795 "superblock": false, 00:19:38.795 "num_base_bdevs": 2, 00:19:38.795 "num_base_bdevs_discovered": 2, 00:19:38.795 "num_base_bdevs_operational": 2, 00:19:38.795 "base_bdevs_list": [ 00:19:38.795 { 00:19:38.795 "name": "spare", 
00:19:38.795 "uuid": "d4916651-cba5-5614-b683-6437662d593d", 00:19:38.795 "is_configured": true, 00:19:38.795 "data_offset": 0, 00:19:38.795 "data_size": 65536 00:19:38.795 }, 00:19:38.795 { 00:19:38.795 "name": "BaseBdev2", 00:19:38.795 "uuid": "abcfa42f-dc3a-5d10-b8f7-f27646ecd35f", 00:19:38.795 "is_configured": true, 00:19:38.795 "data_offset": 0, 00:19:38.795 "data_size": 65536 00:19:38.795 } 00:19:38.795 ] 00:19:38.795 }' 00:19:38.795 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.795 13:42:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:39.359 13:42:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:39.359 [2024-07-15 13:42:26.911142] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:39.360 [2024-07-15 13:42:26.911170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:39.617 00:19:39.617 Latency(us) 00:19:39.617 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:39.617 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:39.617 raid_bdev1 : 10.31 113.35 340.04 0.00 0.00 12527.86 245.76 113063.85 00:19:39.617 =================================================================================================================== 00:19:39.617 Total : 113.35 340.04 0.00 0.00 12527.86 245.76 113063.85 00:19:39.617 [2024-07-15 13:42:26.998151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:39.617 [2024-07-15 13:42:26.998172] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:39.617 [2024-07-15 13:42:26.998221] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:39.617 [2024-07-15 13:42:26.998230] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde1930 name raid_bdev1, state offline 00:19:39.617 0 00:19:39.617 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.617 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:39.617 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:39.617 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:39.617 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:39.618 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:39.876 /dev/nbd0 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:39.876 1+0 records in 00:19:39.876 1+0 records out 00:19:39.876 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266052 s, 15.4 MB/s 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- 
# local i 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:39.876 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:40.134 /dev/nbd1 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:40.134 1+0 records in 00:19:40.134 1+0 records out 00:19:40.134 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275451 s, 14.9 MB/s 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:40.134 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:40.391 13:42:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 65243 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 65243 ']' 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 65243 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 65243 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
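The check above exports the rebuilt "spare" and the surviving BaseBdev2 over NBD and byte-compares them with cmp before tearing the devices down. A minimal standalone sketch of that verification (not part of the captured log), assuming the same RPC socket and an rpc.py invoked from the SPDK tree as in this run:

    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC nbd_start_disk spare /dev/nbd0
    $RPC nbd_start_disk BaseBdev2 /dev/nbd1
    cmp -i 0 /dev/nbd0 /dev/nbd1        # exits non-zero at the first differing byte
    $RPC nbd_stop_disk /dev/nbd1
    $RPC nbd_stop_disk /dev/nbd0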
00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 65243' 00:19:40.649 killing process with pid 65243 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 65243 00:19:40.649 Received shutdown signal, test time was about 11.414239 seconds 00:19:40.649 00:19:40.649 Latency(us) 00:19:40.649 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:40.649 =================================================================================================================== 00:19:40.649 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:40.649 [2024-07-15 13:42:28.098724] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:40.649 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 65243 00:19:40.649 [2024-07-15 13:42:28.118686] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:19:40.907 00:19:40.907 real 0m14.986s 00:19:40.907 user 0m21.963s 00:19:40.907 sys 0m2.304s 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:40.907 ************************************ 00:19:40.907 END TEST raid_rebuild_test_io 00:19:40.907 ************************************ 00:19:40.907 13:42:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:40.907 13:42:28 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:19:40.907 13:42:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:40.907 13:42:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:40.907 13:42:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:40.907 ************************************ 00:19:40.907 START TEST raid_rebuild_test_sb_io 00:19:40.907 ************************************ 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:40.907 13:42:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=67446 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 67446 /var/tmp/spdk-raid.sock 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 67446 ']' 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:40.907 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:40.907 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:40.907 [2024-07-15 13:42:28.446211] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:19:40.907 [2024-07-15 13:42:28.446267] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid67446 ] 00:19:40.907 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:40.907 Zero copy mechanism will not be used. 
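The sb_io variant starting here drives background I/O through bdevperf while the rebuild runs. A sketch of how the tool is wired up, reusing the command line captured above; -z parks bdevperf until a perform_tests RPC arrives, which the test later issues through bdevperf.py (paths are relative to the SPDK tree; the waitforlisten step is elided):

    # start bdevperf against raid_bdev1 on a dedicated RPC socket
    ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    # ...create the base bdevs and the raid1 bdev over that socket, then kick off I/O:
    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &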
00:19:41.165 [2024-07-15 13:42:28.534001] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:41.165 [2024-07-15 13:42:28.623380] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:41.165 [2024-07-15 13:42:28.679514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:41.165 [2024-07-15 13:42:28.679542] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:41.728 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:41.728 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:19:41.728 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:41.728 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:41.985 BaseBdev1_malloc 00:19:41.985 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:41.985 [2024-07-15 13:42:29.559932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:41.985 [2024-07-15 13:42:29.559972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.985 [2024-07-15 13:42:29.559991] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c9600 00:19:41.985 [2024-07-15 13:42:29.560006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.985 [2024-07-15 13:42:29.561281] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.985 [2024-07-15 13:42:29.561307] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:41.985 BaseBdev1 00:19:41.985 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:41.985 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:42.241 BaseBdev2_malloc 00:19:42.241 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:42.498 [2024-07-15 13:42:29.928971] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:42.498 [2024-07-15 13:42:29.929014] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:42.498 [2024-07-15 13:42:29.929034] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22ca120 00:19:42.498 [2024-07-15 13:42:29.929042] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:42.498 [2024-07-15 13:42:29.930243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:42.498 [2024-07-15 13:42:29.930268] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:42.498 BaseBdev2 00:19:42.498 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:19:42.498 spare_malloc 00:19:42.755 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:42.755 spare_delay 00:19:42.755 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:43.013 [2024-07-15 13:42:30.462004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:43.013 [2024-07-15 13:42:30.462043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:43.013 [2024-07-15 13:42:30.462060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2478780 00:19:43.013 [2024-07-15 13:42:30.462068] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:43.013 [2024-07-15 13:42:30.463236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:43.013 [2024-07-15 13:42:30.463260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:43.013 spare 00:19:43.013 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:43.013 [2024-07-15 13:42:30.626444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:43.013 [2024-07-15 13:42:30.627444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:43.013 [2024-07-15 13:42:30.627591] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2479930 00:19:43.013 [2024-07-15 13:42:30.627601] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:43.013 [2024-07-15 13:42:30.627751] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2472d50 00:19:43.013 [2024-07-15 13:42:30.627862] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2479930 00:19:43.013 [2024-07-15 13:42:30.627871] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2479930 00:19:43.013 [2024-07-15 13:42:30.627953] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:43.270 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.270 "name": "raid_bdev1", 00:19:43.270 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:43.270 "strip_size_kb": 0, 00:19:43.270 "state": "online", 00:19:43.270 "raid_level": "raid1", 00:19:43.270 "superblock": true, 00:19:43.270 "num_base_bdevs": 2, 00:19:43.270 "num_base_bdevs_discovered": 2, 00:19:43.270 "num_base_bdevs_operational": 2, 00:19:43.270 "base_bdevs_list": [ 00:19:43.270 { 00:19:43.271 "name": "BaseBdev1", 00:19:43.271 "uuid": "a9b08c52-670d-53cb-9ca3-1ca037b1887f", 00:19:43.271 "is_configured": true, 00:19:43.271 "data_offset": 2048, 00:19:43.271 "data_size": 63488 00:19:43.271 }, 00:19:43.271 { 00:19:43.271 "name": "BaseBdev2", 00:19:43.271 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:43.271 "is_configured": true, 00:19:43.271 "data_offset": 2048, 00:19:43.271 "data_size": 63488 00:19:43.271 } 00:19:43.271 ] 00:19:43.271 }' 00:19:43.271 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.271 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:43.834 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:43.834 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:43.834 [2024-07-15 13:42:31.420633] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:43.834 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:19:43.834 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:43.834 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.098 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:19:44.098 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:19:44.098 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:44.098 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:44.098 [2024-07-15 13:42:31.695047] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x247a510 00:19:44.098 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:44.098 Zero copy mechanism will not be used. 00:19:44.098 Running I/O for 60 seconds... 
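The repeated verify_raid_bdev_process calls in this log reduce to polling bdev_raid_get_bdevs and reading the rebuild fields with jq until the process entry disappears (.process.type falls back to "none"). A compact sketch of that loop using the same RPC and jq filters as the test; the 60-second bound is illustrative, the script computes its own timeout:

    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    timeout=$((SECONDS + 60))
    while (( SECONDS < timeout )); do
        info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
        ptype=$(echo "$info" | jq -r '.process.type // "none"')
        [[ $ptype == none ]] && break   # rebuild finished; the process block is gone
        echo "rebuild -> $(echo "$info" | jq -r '.process.target') at $(echo "$info" | jq -r '.process.progress.percent')%"
        sleep 1
    done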
00:19:44.356 [2024-07-15 13:42:31.789071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:44.356 [2024-07-15 13:42:31.794170] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247a510 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.356 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:44.612 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.612 "name": "raid_bdev1", 00:19:44.612 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:44.612 "strip_size_kb": 0, 00:19:44.612 "state": "online", 00:19:44.612 "raid_level": "raid1", 00:19:44.612 "superblock": true, 00:19:44.612 "num_base_bdevs": 2, 00:19:44.612 "num_base_bdevs_discovered": 1, 00:19:44.612 "num_base_bdevs_operational": 1, 00:19:44.612 "base_bdevs_list": [ 00:19:44.612 { 00:19:44.612 "name": null, 00:19:44.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.612 "is_configured": false, 00:19:44.612 "data_offset": 2048, 00:19:44.612 "data_size": 63488 00:19:44.612 }, 00:19:44.612 { 00:19:44.612 "name": "BaseBdev2", 00:19:44.612 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:44.612 "is_configured": true, 00:19:44.612 "data_offset": 2048, 00:19:44.612 "data_size": 63488 00:19:44.612 } 00:19:44.612 ] 00:19:44.612 }' 00:19:44.612 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.612 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:45.208 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:45.208 [2024-07-15 13:42:32.676077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:45.208 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:45.208 [2024-07-15 13:42:32.722238] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23fca50 00:19:45.208 [2024-07-15 13:42:32.723944] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:19:45.465 [2024-07-15 13:42:32.837517] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:45.465 [2024-07-15 13:42:32.837805] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:45.465 [2024-07-15 13:42:33.045583] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:45.465 [2024-07-15 13:42:33.045765] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:46.028 [2024-07-15 13:42:33.399724] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:46.028 [2024-07-15 13:42:33.507442] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.285 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.542 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:46.542 "name": "raid_bdev1", 00:19:46.542 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:46.542 "strip_size_kb": 0, 00:19:46.542 "state": "online", 00:19:46.542 "raid_level": "raid1", 00:19:46.542 "superblock": true, 00:19:46.542 "num_base_bdevs": 2, 00:19:46.542 "num_base_bdevs_discovered": 2, 00:19:46.542 "num_base_bdevs_operational": 2, 00:19:46.542 "process": { 00:19:46.542 "type": "rebuild", 00:19:46.542 "target": "spare", 00:19:46.542 "progress": { 00:19:46.542 "blocks": 14336, 00:19:46.542 "percent": 22 00:19:46.542 } 00:19:46.542 }, 00:19:46.542 "base_bdevs_list": [ 00:19:46.542 { 00:19:46.542 "name": "spare", 00:19:46.542 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:46.542 "is_configured": true, 00:19:46.542 "data_offset": 2048, 00:19:46.542 "data_size": 63488 00:19:46.542 }, 00:19:46.542 { 00:19:46.542 "name": "BaseBdev2", 00:19:46.542 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:46.542 "is_configured": true, 00:19:46.542 "data_offset": 2048, 00:19:46.542 "data_size": 63488 00:19:46.542 } 00:19:46.542 ] 00:19:46.542 }' 00:19:46.542 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:46.542 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:46.542 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:46.542 [2024-07-15 13:42:33.958054] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 
12288 offset_end: 18432 00:19:46.542 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:46.542 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:46.542 [2024-07-15 13:42:34.138857] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:46.799 [2024-07-15 13:42:34.191095] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:19:46.799 [2024-07-15 13:42:34.298166] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:46.799 [2024-07-15 13:42:34.305088] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:46.799 [2024-07-15 13:42:34.305110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:46.799 [2024-07-15 13:42:34.305117] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:46.799 [2024-07-15 13:42:34.326316] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247a510 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.799 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.056 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.056 "name": "raid_bdev1", 00:19:47.056 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:47.056 "strip_size_kb": 0, 00:19:47.056 "state": "online", 00:19:47.056 "raid_level": "raid1", 00:19:47.056 "superblock": true, 00:19:47.056 "num_base_bdevs": 2, 00:19:47.056 "num_base_bdevs_discovered": 1, 00:19:47.056 "num_base_bdevs_operational": 1, 00:19:47.056 "base_bdevs_list": [ 00:19:47.056 { 00:19:47.056 "name": null, 00:19:47.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.056 "is_configured": false, 00:19:47.056 "data_offset": 2048, 00:19:47.056 "data_size": 63488 00:19:47.056 }, 00:19:47.056 { 00:19:47.056 "name": "BaseBdev2", 00:19:47.056 "uuid": 
"8067ed97-3d67-5fb3-9154-139c18597624", 00:19:47.056 "is_configured": true, 00:19:47.056 "data_offset": 2048, 00:19:47.056 "data_size": 63488 00:19:47.056 } 00:19:47.056 ] 00:19:47.056 }' 00:19:47.056 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.056 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.620 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.877 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:47.877 "name": "raid_bdev1", 00:19:47.877 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:47.877 "strip_size_kb": 0, 00:19:47.877 "state": "online", 00:19:47.877 "raid_level": "raid1", 00:19:47.877 "superblock": true, 00:19:47.877 "num_base_bdevs": 2, 00:19:47.877 "num_base_bdevs_discovered": 1, 00:19:47.877 "num_base_bdevs_operational": 1, 00:19:47.877 "base_bdevs_list": [ 00:19:47.877 { 00:19:47.877 "name": null, 00:19:47.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.877 "is_configured": false, 00:19:47.877 "data_offset": 2048, 00:19:47.877 "data_size": 63488 00:19:47.877 }, 00:19:47.877 { 00:19:47.877 "name": "BaseBdev2", 00:19:47.877 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:47.877 "is_configured": true, 00:19:47.877 "data_offset": 2048, 00:19:47.877 "data_size": 63488 00:19:47.877 } 00:19:47.877 ] 00:19:47.877 }' 00:19:47.877 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:47.877 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:47.877 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:47.877 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:47.877 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:47.877 [2024-07-15 13:42:35.475627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:48.134 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:48.134 [2024-07-15 13:42:35.529080] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22dbc00 00:19:48.134 [2024-07-15 13:42:35.530202] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:48.134 [2024-07-15 13:42:35.638031] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 
offset_end: 6144 00:19:48.134 [2024-07-15 13:42:35.638447] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:48.392 [2024-07-15 13:42:35.845101] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:48.392 [2024-07-15 13:42:35.845202] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:48.957 [2024-07-15 13:42:36.327957] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:48.957 [2024-07-15 13:42:36.328180] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.957 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.214 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:49.214 "name": "raid_bdev1", 00:19:49.214 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:49.214 "strip_size_kb": 0, 00:19:49.214 "state": "online", 00:19:49.214 "raid_level": "raid1", 00:19:49.214 "superblock": true, 00:19:49.214 "num_base_bdevs": 2, 00:19:49.214 "num_base_bdevs_discovered": 2, 00:19:49.215 "num_base_bdevs_operational": 2, 00:19:49.215 "process": { 00:19:49.215 "type": "rebuild", 00:19:49.215 "target": "spare", 00:19:49.215 "progress": { 00:19:49.215 "blocks": 16384, 00:19:49.215 "percent": 25 00:19:49.215 } 00:19:49.215 }, 00:19:49.215 "base_bdevs_list": [ 00:19:49.215 { 00:19:49.215 "name": "spare", 00:19:49.215 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:49.215 "is_configured": true, 00:19:49.215 "data_offset": 2048, 00:19:49.215 "data_size": 63488 00:19:49.215 }, 00:19:49.215 { 00:19:49.215 "name": "BaseBdev2", 00:19:49.215 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:49.215 "is_configured": true, 00:19:49.215 "data_offset": 2048, 00:19:49.215 "data_size": 63488 00:19:49.215 } 00:19:49.215 ] 00:19:49.215 }' 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:19:49.215 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=656 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.215 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.473 [2024-07-15 13:42:36.883688] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:19:49.473 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:49.473 "name": "raid_bdev1", 00:19:49.473 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:49.473 "strip_size_kb": 0, 00:19:49.473 "state": "online", 00:19:49.473 "raid_level": "raid1", 00:19:49.473 "superblock": true, 00:19:49.473 "num_base_bdevs": 2, 00:19:49.473 "num_base_bdevs_discovered": 2, 00:19:49.473 "num_base_bdevs_operational": 2, 00:19:49.473 "process": { 00:19:49.473 "type": "rebuild", 00:19:49.473 "target": "spare", 00:19:49.473 "progress": { 00:19:49.473 "blocks": 20480, 00:19:49.473 "percent": 32 00:19:49.473 } 00:19:49.473 }, 00:19:49.473 "base_bdevs_list": [ 00:19:49.473 { 00:19:49.473 "name": "spare", 00:19:49.473 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:49.473 "is_configured": true, 00:19:49.473 "data_offset": 2048, 00:19:49.473 "data_size": 63488 00:19:49.473 }, 00:19:49.473 { 00:19:49.473 "name": "BaseBdev2", 00:19:49.473 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:49.473 "is_configured": true, 00:19:49.473 "data_offset": 2048, 00:19:49.473 "data_size": 63488 00:19:49.473 } 00:19:49.473 ] 00:19:49.473 }' 00:19:49.473 13:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:49.473 13:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:49.473 13:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:49.473 13:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:49.473 13:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:49.731 [2024-07-15 
13:42:37.103409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:19:49.731 [2024-07-15 13:42:37.103685] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:19:49.988 [2024-07-15 13:42:37.530880] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:19:49.988 [2024-07-15 13:42:37.531047] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:19:50.553 [2024-07-15 13:42:37.960732] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.553 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.810 [2024-07-15 13:42:38.187866] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:19:50.810 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:50.810 "name": "raid_bdev1", 00:19:50.810 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:50.810 "strip_size_kb": 0, 00:19:50.810 "state": "online", 00:19:50.810 "raid_level": "raid1", 00:19:50.810 "superblock": true, 00:19:50.810 "num_base_bdevs": 2, 00:19:50.810 "num_base_bdevs_discovered": 2, 00:19:50.810 "num_base_bdevs_operational": 2, 00:19:50.810 "process": { 00:19:50.810 "type": "rebuild", 00:19:50.810 "target": "spare", 00:19:50.810 "progress": { 00:19:50.810 "blocks": 38912, 00:19:50.810 "percent": 61 00:19:50.810 } 00:19:50.810 }, 00:19:50.810 "base_bdevs_list": [ 00:19:50.810 { 00:19:50.810 "name": "spare", 00:19:50.810 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:50.810 "is_configured": true, 00:19:50.810 "data_offset": 2048, 00:19:50.810 "data_size": 63488 00:19:50.810 }, 00:19:50.810 { 00:19:50.810 "name": "BaseBdev2", 00:19:50.810 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:50.810 "is_configured": true, 00:19:50.810 "data_offset": 2048, 00:19:50.810 "data_size": 63488 00:19:50.810 } 00:19:50.810 ] 00:19:50.810 }' 00:19:50.810 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:50.810 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:50.810 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:50.810 [2024-07-15 
13:42:38.294607] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:19:50.810 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:50.810 13:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:51.375 [2024-07-15 13:42:38.860448] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:19:51.375 [2024-07-15 13:42:38.860713] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:51.940 "name": "raid_bdev1", 00:19:51.940 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:51.940 "strip_size_kb": 0, 00:19:51.940 "state": "online", 00:19:51.940 "raid_level": "raid1", 00:19:51.940 "superblock": true, 00:19:51.940 "num_base_bdevs": 2, 00:19:51.940 "num_base_bdevs_discovered": 2, 00:19:51.940 "num_base_bdevs_operational": 2, 00:19:51.940 "process": { 00:19:51.940 "type": "rebuild", 00:19:51.940 "target": "spare", 00:19:51.940 "progress": { 00:19:51.940 "blocks": 59392, 00:19:51.940 "percent": 93 00:19:51.940 } 00:19:51.940 }, 00:19:51.940 "base_bdevs_list": [ 00:19:51.940 { 00:19:51.940 "name": "spare", 00:19:51.940 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:51.940 "is_configured": true, 00:19:51.940 "data_offset": 2048, 00:19:51.940 "data_size": 63488 00:19:51.940 }, 00:19:51.940 { 00:19:51.940 "name": "BaseBdev2", 00:19:51.940 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:51.940 "is_configured": true, 00:19:51.940 "data_offset": 2048, 00:19:51.940 "data_size": 63488 00:19:51.940 } 00:19:51.940 ] 00:19:51.940 }' 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:51.940 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:52.197 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:52.197 13:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:52.197 [2024-07-15 13:42:39.635291] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on 
raid_bdev1 00:19:52.197 [2024-07-15 13:42:39.740554] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:52.197 [2024-07-15 13:42:39.742004] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.127 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:53.384 "name": "raid_bdev1", 00:19:53.384 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:53.384 "strip_size_kb": 0, 00:19:53.384 "state": "online", 00:19:53.384 "raid_level": "raid1", 00:19:53.384 "superblock": true, 00:19:53.384 "num_base_bdevs": 2, 00:19:53.384 "num_base_bdevs_discovered": 2, 00:19:53.384 "num_base_bdevs_operational": 2, 00:19:53.384 "base_bdevs_list": [ 00:19:53.384 { 00:19:53.384 "name": "spare", 00:19:53.384 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:53.384 "is_configured": true, 00:19:53.384 "data_offset": 2048, 00:19:53.384 "data_size": 63488 00:19:53.384 }, 00:19:53.384 { 00:19:53.384 "name": "BaseBdev2", 00:19:53.384 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:53.384 "is_configured": true, 00:19:53.384 "data_offset": 2048, 00:19:53.384 "data_size": 63488 00:19:53.384 } 00:19:53.384 ] 00:19:53.384 }' 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.384 13:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:53.641 "name": "raid_bdev1", 00:19:53.641 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:53.641 "strip_size_kb": 0, 00:19:53.641 "state": "online", 00:19:53.641 "raid_level": "raid1", 00:19:53.641 "superblock": true, 00:19:53.641 "num_base_bdevs": 2, 00:19:53.641 "num_base_bdevs_discovered": 2, 00:19:53.641 "num_base_bdevs_operational": 2, 00:19:53.641 "base_bdevs_list": [ 00:19:53.641 { 00:19:53.641 "name": "spare", 00:19:53.641 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:53.641 "is_configured": true, 00:19:53.641 "data_offset": 2048, 00:19:53.641 "data_size": 63488 00:19:53.641 }, 00:19:53.641 { 00:19:53.641 "name": "BaseBdev2", 00:19:53.641 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:53.641 "is_configured": true, 00:19:53.641 "data_offset": 2048, 00:19:53.641 "data_size": 63488 00:19:53.641 } 00:19:53.641 ] 00:19:53.641 }' 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.641 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.897 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.897 "name": "raid_bdev1", 00:19:53.897 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:53.897 "strip_size_kb": 0, 00:19:53.897 "state": "online", 00:19:53.897 "raid_level": "raid1", 00:19:53.897 "superblock": true, 00:19:53.897 "num_base_bdevs": 2, 00:19:53.897 "num_base_bdevs_discovered": 2, 00:19:53.897 "num_base_bdevs_operational": 2, 00:19:53.897 
"base_bdevs_list": [ 00:19:53.897 { 00:19:53.897 "name": "spare", 00:19:53.897 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:53.897 "is_configured": true, 00:19:53.897 "data_offset": 2048, 00:19:53.897 "data_size": 63488 00:19:53.897 }, 00:19:53.897 { 00:19:53.897 "name": "BaseBdev2", 00:19:53.897 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:53.897 "is_configured": true, 00:19:53.897 "data_offset": 2048, 00:19:53.897 "data_size": 63488 00:19:53.897 } 00:19:53.897 ] 00:19:53.897 }' 00:19:53.897 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.897 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:54.461 13:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:54.461 [2024-07-15 13:42:41.961306] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:54.461 [2024-07-15 13:42:41.961331] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:54.461 00:19:54.461 Latency(us) 00:19:54.461 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:54.461 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:54.461 raid_bdev1 : 10.27 118.09 354.28 0.00 0.00 11788.23 245.76 112152.04 00:19:54.462 =================================================================================================================== 00:19:54.462 Total : 118.09 354.28 0.00 0.00 11788.23 245.76 112152.04 00:19:54.462 [2024-07-15 13:42:41.996277] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:54.462 [2024-07-15 13:42:41.996300] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:54.462 [2024-07-15 13:42:41.996352] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:54.462 [2024-07-15 13:42:41.996361] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2479930 name raid_bdev1, state offline 00:19:54.462 0 00:19:54.462 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.462 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:54.719 13:42:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:54.719 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:54.976 /dev/nbd0 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:54.976 1+0 records in 00:19:54.976 1+0 records out 00:19:54.976 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273539 s, 15.0 MB/s 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:54.976 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:54.976 /dev/nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:55.233 1+0 records in 00:19:55.233 1+0 records out 00:19:55.233 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286122 s, 14.3 MB/s 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:55.233 
13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:55.233 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:55.491 13:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:19:55.491 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:55.748 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:56.007 [2024-07-15 13:42:43.392056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
spare_delay 00:19:56.007 [2024-07-15 13:42:43.392098] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:56.007 [2024-07-15 13:42:43.392117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22da7f0 00:19:56.007 [2024-07-15 13:42:43.392126] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:56.007 [2024-07-15 13:42:43.393320] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:56.007 [2024-07-15 13:42:43.393345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:56.007 [2024-07-15 13:42:43.393405] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:56.007 [2024-07-15 13:42:43.393425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:56.007 [2024-07-15 13:42:43.393495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:56.007 spare 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.007 [2024-07-15 13:42:43.493787] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22c85c0 00:19:56.007 [2024-07-15 13:42:43.493804] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:56.007 [2024-07-15 13:42:43.493966] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2472810 00:19:56.007 [2024-07-15 13:42:43.494094] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22c85c0 00:19:56.007 [2024-07-15 13:42:43.494102] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22c85c0 00:19:56.007 [2024-07-15 13:42:43.494197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.007 "name": "raid_bdev1", 00:19:56.007 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:56.007 "strip_size_kb": 0, 00:19:56.007 "state": "online", 00:19:56.007 "raid_level": "raid1", 00:19:56.007 
"superblock": true, 00:19:56.007 "num_base_bdevs": 2, 00:19:56.007 "num_base_bdevs_discovered": 2, 00:19:56.007 "num_base_bdevs_operational": 2, 00:19:56.007 "base_bdevs_list": [ 00:19:56.007 { 00:19:56.007 "name": "spare", 00:19:56.007 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:56.007 "is_configured": true, 00:19:56.007 "data_offset": 2048, 00:19:56.007 "data_size": 63488 00:19:56.007 }, 00:19:56.007 { 00:19:56.007 "name": "BaseBdev2", 00:19:56.007 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:56.007 "is_configured": true, 00:19:56.007 "data_offset": 2048, 00:19:56.007 "data_size": 63488 00:19:56.007 } 00:19:56.007 ] 00:19:56.007 }' 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.007 13:42:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.574 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.832 "name": "raid_bdev1", 00:19:56.832 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:56.832 "strip_size_kb": 0, 00:19:56.832 "state": "online", 00:19:56.832 "raid_level": "raid1", 00:19:56.832 "superblock": true, 00:19:56.832 "num_base_bdevs": 2, 00:19:56.832 "num_base_bdevs_discovered": 2, 00:19:56.832 "num_base_bdevs_operational": 2, 00:19:56.832 "base_bdevs_list": [ 00:19:56.832 { 00:19:56.832 "name": "spare", 00:19:56.832 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:56.832 "is_configured": true, 00:19:56.832 "data_offset": 2048, 00:19:56.832 "data_size": 63488 00:19:56.832 }, 00:19:56.832 { 00:19:56.832 "name": "BaseBdev2", 00:19:56.832 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:56.832 "is_configured": true, 00:19:56.832 "data_offset": 2048, 00:19:56.832 "data_size": 63488 00:19:56.832 } 00:19:56.832 ] 00:19:56.832 }' 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.832 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:57.090 13:42:44 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:19:57.090 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:57.090 [2024-07-15 13:42:44.671543] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:57.090 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:57.090 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:57.090 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:57.090 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:57.090 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.091 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:57.349 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:57.349 "name": "raid_bdev1", 00:19:57.349 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:57.349 "strip_size_kb": 0, 00:19:57.349 "state": "online", 00:19:57.349 "raid_level": "raid1", 00:19:57.349 "superblock": true, 00:19:57.349 "num_base_bdevs": 2, 00:19:57.349 "num_base_bdevs_discovered": 1, 00:19:57.350 "num_base_bdevs_operational": 1, 00:19:57.350 "base_bdevs_list": [ 00:19:57.350 { 00:19:57.350 "name": null, 00:19:57.350 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.350 "is_configured": false, 00:19:57.350 "data_offset": 2048, 00:19:57.350 "data_size": 63488 00:19:57.350 }, 00:19:57.350 { 00:19:57.350 "name": "BaseBdev2", 00:19:57.350 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:57.350 "is_configured": true, 00:19:57.350 "data_offset": 2048, 00:19:57.350 "data_size": 63488 00:19:57.350 } 00:19:57.350 ] 00:19:57.350 }' 00:19:57.350 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:57.350 13:42:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:57.915 13:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:57.915 [2024-07-15 13:42:45.517818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:57.915 [2024-07-15 13:42:45.517939] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 
(5) 00:19:57.915 [2024-07-15 13:42:45.517950] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:57.915 [2024-07-15 13:42:45.517971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:57.915 [2024-07-15 13:42:45.522946] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2472810 00:19:57.915 [2024-07-15 13:42:45.524641] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:58.174 13:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.179 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.179 "name": "raid_bdev1", 00:19:59.179 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:59.179 "strip_size_kb": 0, 00:19:59.179 "state": "online", 00:19:59.179 "raid_level": "raid1", 00:19:59.179 "superblock": true, 00:19:59.179 "num_base_bdevs": 2, 00:19:59.179 "num_base_bdevs_discovered": 2, 00:19:59.179 "num_base_bdevs_operational": 2, 00:19:59.179 "process": { 00:19:59.179 "type": "rebuild", 00:19:59.179 "target": "spare", 00:19:59.179 "progress": { 00:19:59.179 "blocks": 22528, 00:19:59.179 "percent": 35 00:19:59.179 } 00:19:59.179 }, 00:19:59.179 "base_bdevs_list": [ 00:19:59.179 { 00:19:59.179 "name": "spare", 00:19:59.179 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:19:59.179 "is_configured": true, 00:19:59.179 "data_offset": 2048, 00:19:59.179 "data_size": 63488 00:19:59.179 }, 00:19:59.179 { 00:19:59.179 "name": "BaseBdev2", 00:19:59.179 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:59.179 "is_configured": true, 00:19:59.179 "data_offset": 2048, 00:19:59.179 "data_size": 63488 00:19:59.179 } 00:19:59.179 ] 00:19:59.180 }' 00:19:59.180 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.180 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:59.180 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.438 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:59.438 13:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:59.438 [2024-07-15 13:42:46.954790] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:59.438 [2024-07-15 13:42:47.035796] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:59.438 [2024-07-15 13:42:47.035831] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:59.438 [2024-07-15 13:42:47.035841] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:59.438 [2024-07-15 13:42:47.035847] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:59.696 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:59.696 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.696 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.696 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:59.696 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:59.696 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.697 "name": "raid_bdev1", 00:19:59.697 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:19:59.697 "strip_size_kb": 0, 00:19:59.697 "state": "online", 00:19:59.697 "raid_level": "raid1", 00:19:59.697 "superblock": true, 00:19:59.697 "num_base_bdevs": 2, 00:19:59.697 "num_base_bdevs_discovered": 1, 00:19:59.697 "num_base_bdevs_operational": 1, 00:19:59.697 "base_bdevs_list": [ 00:19:59.697 { 00:19:59.697 "name": null, 00:19:59.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.697 "is_configured": false, 00:19:59.697 "data_offset": 2048, 00:19:59.697 "data_size": 63488 00:19:59.697 }, 00:19:59.697 { 00:19:59.697 "name": "BaseBdev2", 00:19:59.697 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:19:59.697 "is_configured": true, 00:19:59.697 "data_offset": 2048, 00:19:59.697 "data_size": 63488 00:19:59.697 } 00:19:59.697 ] 00:19:59.697 }' 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.697 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:00.262 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:00.262 [2024-07-15 13:42:47.878435] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:00.262 [2024-07-15 13:42:47.878483] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:20:00.262 [2024-07-15 13:42:47.878507] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2478180 00:20:00.262 [2024-07-15 13:42:47.878519] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.262 [2024-07-15 13:42:47.878812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.262 [2024-07-15 13:42:47.878828] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:00.262 [2024-07-15 13:42:47.878897] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:00.262 [2024-07-15 13:42:47.878907] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:00.262 [2024-07-15 13:42:47.878916] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:00.262 [2024-07-15 13:42:47.878932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:00.520 [2024-07-15 13:42:47.884014] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2472810 00:20:00.520 spare 00:20:00.520 [2024-07-15 13:42:47.885100] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:00.520 13:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.452 13:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:01.710 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:01.710 "name": "raid_bdev1", 00:20:01.710 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:01.710 "strip_size_kb": 0, 00:20:01.710 "state": "online", 00:20:01.710 "raid_level": "raid1", 00:20:01.710 "superblock": true, 00:20:01.710 "num_base_bdevs": 2, 00:20:01.710 "num_base_bdevs_discovered": 2, 00:20:01.710 "num_base_bdevs_operational": 2, 00:20:01.710 "process": { 00:20:01.710 "type": "rebuild", 00:20:01.710 "target": "spare", 00:20:01.710 "progress": { 00:20:01.710 "blocks": 22528, 00:20:01.710 "percent": 35 00:20:01.710 } 00:20:01.710 }, 00:20:01.710 "base_bdevs_list": [ 00:20:01.710 { 00:20:01.710 "name": "spare", 00:20:01.710 "uuid": "84fd4d61-0e63-52b3-9ec2-6a100759f2c9", 00:20:01.710 "is_configured": true, 00:20:01.710 "data_offset": 2048, 00:20:01.710 "data_size": 63488 00:20:01.710 }, 00:20:01.710 { 00:20:01.710 "name": "BaseBdev2", 00:20:01.710 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:20:01.710 "is_configured": true, 00:20:01.710 "data_offset": 2048, 00:20:01.710 "data_size": 63488 00:20:01.710 } 00:20:01.710 ] 00:20:01.710 }' 
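The verify_raid_bdev_process trace above repeats a single pattern: dump every raid bdev over the RPC socket, keep only raid_bdev1 with jq, then compare the rebuild process fields against the expected type and target. A minimal standalone sketch of that pattern, using the socket path, RPC method and jq filters visible in this run (the helper name itself is only illustrative, not part of the SPDK tree), looks like:

check_rebuild() {
    # Expected process type ("rebuild" or "none") and target ("spare" or "none").
    local expected_type=$1 expected_target=$2
    local info
    # Ask the running SPDK app for all raid bdevs and keep only raid_bdev1.
    info=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # The "process" object disappears once the rebuild finishes, hence the // "none" fallback.
    [ "$(echo "$info" | jq -r '.process.type // "none"')" = "$expected_type" ] || return 1
    [ "$(echo "$info" | jq -r '.process.target // "none"')" = "$expected_target" ] || return 1
}

During the window sampled above such a check passes for "rebuild spare"; once the process entry is gone it only passes for "none none".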
00:20:01.710 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:01.710 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:01.710 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:01.710 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:01.710 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:01.966 [2024-07-15 13:42:49.339799] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:01.966 [2024-07-15 13:42:49.395914] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:01.966 [2024-07-15 13:42:49.395947] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:01.966 [2024-07-15 13:42:49.395957] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:01.966 [2024-07-15 13:42:49.395962] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.966 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.224 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.224 "name": "raid_bdev1", 00:20:02.224 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:02.224 "strip_size_kb": 0, 00:20:02.224 "state": "online", 00:20:02.224 "raid_level": "raid1", 00:20:02.224 "superblock": true, 00:20:02.224 "num_base_bdevs": 2, 00:20:02.224 "num_base_bdevs_discovered": 1, 00:20:02.224 "num_base_bdevs_operational": 1, 00:20:02.224 "base_bdevs_list": [ 00:20:02.224 { 00:20:02.224 "name": null, 00:20:02.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.224 "is_configured": false, 00:20:02.224 "data_offset": 2048, 00:20:02.224 "data_size": 63488 00:20:02.224 }, 00:20:02.224 { 00:20:02.224 "name": "BaseBdev2", 00:20:02.224 "uuid": 
"8067ed97-3d67-5fb3-9154-139c18597624", 00:20:02.224 "is_configured": true, 00:20:02.224 "data_offset": 2048, 00:20:02.224 "data_size": 63488 00:20:02.224 } 00:20:02.224 ] 00:20:02.224 }' 00:20:02.224 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.224 13:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:02.482 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:02.482 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:02.482 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:02.482 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:02.482 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:02.482 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.740 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.740 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:02.740 "name": "raid_bdev1", 00:20:02.740 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:02.740 "strip_size_kb": 0, 00:20:02.740 "state": "online", 00:20:02.740 "raid_level": "raid1", 00:20:02.740 "superblock": true, 00:20:02.740 "num_base_bdevs": 2, 00:20:02.740 "num_base_bdevs_discovered": 1, 00:20:02.740 "num_base_bdevs_operational": 1, 00:20:02.740 "base_bdevs_list": [ 00:20:02.740 { 00:20:02.740 "name": null, 00:20:02.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.740 "is_configured": false, 00:20:02.740 "data_offset": 2048, 00:20:02.740 "data_size": 63488 00:20:02.740 }, 00:20:02.740 { 00:20:02.740 "name": "BaseBdev2", 00:20:02.740 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:20:02.740 "is_configured": true, 00:20:02.740 "data_offset": 2048, 00:20:02.740 "data_size": 63488 00:20:02.740 } 00:20:02.740 ] 00:20:02.740 }' 00:20:02.740 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:02.740 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:02.740 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:02.997 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:02.997 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:02.997 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:03.256 [2024-07-15 13:42:50.692626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:03.256 [2024-07-15 13:42:50.692664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:03.256 [2024-07-15 13:42:50.692679] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c8940 00:20:03.256 [2024-07-15 
13:42:50.692688] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:03.256 [2024-07-15 13:42:50.692939] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:03.256 [2024-07-15 13:42:50.692952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:03.256 [2024-07-15 13:42:50.693006] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:03.256 [2024-07-15 13:42:50.693031] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:03.256 [2024-07-15 13:42:50.693039] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:03.256 BaseBdev1 00:20:03.256 13:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.187 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.444 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.444 "name": "raid_bdev1", 00:20:04.444 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:04.444 "strip_size_kb": 0, 00:20:04.444 "state": "online", 00:20:04.444 "raid_level": "raid1", 00:20:04.444 "superblock": true, 00:20:04.444 "num_base_bdevs": 2, 00:20:04.444 "num_base_bdevs_discovered": 1, 00:20:04.444 "num_base_bdevs_operational": 1, 00:20:04.444 "base_bdevs_list": [ 00:20:04.444 { 00:20:04.444 "name": null, 00:20:04.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.444 "is_configured": false, 00:20:04.444 "data_offset": 2048, 00:20:04.444 "data_size": 63488 00:20:04.444 }, 00:20:04.444 { 00:20:04.444 "name": "BaseBdev2", 00:20:04.444 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:20:04.444 "is_configured": true, 00:20:04.444 "data_offset": 2048, 00:20:04.444 "data_size": 63488 00:20:04.444 } 00:20:04.444 ] 00:20:04.444 }' 00:20:04.444 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.444 13:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 
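Editor's note: the state check this trace keeps repeating (verify_raid_bdev_process / verify_raid_bdev_state) boils down to one RPC call filtered twice with jq. A minimal stand-alone sketch of that check follows; the rpc.py path, the /var/tmp/spdk-raid.sock socket and the raid_bdev1 name are simply the ones used by this run, so treat them as placeholders elsewhere.

    # Sketch: fetch all raid bdevs from the test target, pick raid_bdev1, and
    # confirm that no background process (rebuild) is reported for it.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    info=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(echo "$info" | jq -r '.process.type   // "none"') == none ]]   # no rebuild in progress
    [[ $(echo "$info" | jq -r '.process.target // "none"') == none ]]   # no rebuild target

The same pattern, with expected values rebuild/spare instead of none/none, is what the later raid_rebuild_test steps in this log use while a rebuild is actually in flight.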
00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:05.011 "name": "raid_bdev1", 00:20:05.011 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:05.011 "strip_size_kb": 0, 00:20:05.011 "state": "online", 00:20:05.011 "raid_level": "raid1", 00:20:05.011 "superblock": true, 00:20:05.011 "num_base_bdevs": 2, 00:20:05.011 "num_base_bdevs_discovered": 1, 00:20:05.011 "num_base_bdevs_operational": 1, 00:20:05.011 "base_bdevs_list": [ 00:20:05.011 { 00:20:05.011 "name": null, 00:20:05.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.011 "is_configured": false, 00:20:05.011 "data_offset": 2048, 00:20:05.011 "data_size": 63488 00:20:05.011 }, 00:20:05.011 { 00:20:05.011 "name": "BaseBdev2", 00:20:05.011 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:20:05.011 "is_configured": true, 00:20:05.011 "data_offset": 2048, 00:20:05.011 "data_size": 63488 00:20:05.011 } 00:20:05.011 ] 00:20:05.011 }' 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:05.011 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:05.269 [2024-07-15 13:42:52.810296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:05.269 [2024-07-15 13:42:52.810397] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:05.269 [2024-07-15 13:42:52.810407] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:05.269 request: 00:20:05.269 { 00:20:05.269 "base_bdev": "BaseBdev1", 00:20:05.269 "raid_bdev": "raid_bdev1", 00:20:05.269 "method": "bdev_raid_add_base_bdev", 00:20:05.269 "req_id": 1 00:20:05.269 } 00:20:05.269 Got JSON-RPC error response 00:20:05.269 response: 00:20:05.269 { 00:20:05.269 "code": -22, 00:20:05.269 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:05.269 } 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:05.269 13:42:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.643 13:42:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:06.643 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.643 "name": "raid_bdev1", 00:20:06.643 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:06.643 "strip_size_kb": 0, 00:20:06.643 "state": "online", 00:20:06.643 "raid_level": "raid1", 00:20:06.643 "superblock": true, 00:20:06.643 "num_base_bdevs": 2, 00:20:06.643 "num_base_bdevs_discovered": 1, 00:20:06.643 "num_base_bdevs_operational": 1, 00:20:06.643 "base_bdevs_list": [ 00:20:06.643 { 00:20:06.643 "name": null, 00:20:06.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.643 "is_configured": false, 00:20:06.643 "data_offset": 2048, 00:20:06.643 "data_size": 63488 00:20:06.643 }, 00:20:06.643 { 00:20:06.643 "name": "BaseBdev2", 00:20:06.643 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:20:06.643 "is_configured": true, 00:20:06.643 "data_offset": 2048, 00:20:06.643 "data_size": 63488 00:20:06.643 } 00:20:06.643 ] 00:20:06.643 }' 00:20:06.643 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.643 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:06.901 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:06.901 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:06.901 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:06.901 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:06.901 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:07.159 "name": "raid_bdev1", 00:20:07.159 "uuid": "7ec3684f-7e03-40f6-ba60-07b13182dc65", 00:20:07.159 "strip_size_kb": 0, 00:20:07.159 "state": "online", 00:20:07.159 "raid_level": "raid1", 00:20:07.159 "superblock": true, 00:20:07.159 "num_base_bdevs": 2, 00:20:07.159 "num_base_bdevs_discovered": 1, 00:20:07.159 "num_base_bdevs_operational": 1, 00:20:07.159 "base_bdevs_list": [ 00:20:07.159 { 00:20:07.159 "name": null, 00:20:07.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.159 "is_configured": false, 00:20:07.159 "data_offset": 2048, 00:20:07.159 "data_size": 63488 00:20:07.159 }, 00:20:07.159 { 00:20:07.159 "name": "BaseBdev2", 00:20:07.159 "uuid": "8067ed97-3d67-5fb3-9154-139c18597624", 00:20:07.159 "is_configured": true, 00:20:07.159 "data_offset": 2048, 00:20:07.159 "data_size": 63488 00:20:07.159 } 00:20:07.159 ] 00:20:07.159 }' 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ 
none == \n\o\n\e ]] 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 67446 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 67446 ']' 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 67446 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:07.159 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 67446 00:20:07.417 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:07.417 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:07.417 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 67446' 00:20:07.417 killing process with pid 67446 00:20:07.417 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 67446 00:20:07.417 Received shutdown signal, test time was about 23.056038 seconds 00:20:07.417 00:20:07.417 Latency(us) 00:20:07.417 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:07.417 =================================================================================================================== 00:20:07.417 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:07.417 [2024-07-15 13:42:54.808999] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:07.417 13:42:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 67446 00:20:07.417 [2024-07-15 13:42:54.809074] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:07.417 [2024-07-15 13:42:54.809109] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:07.417 [2024-07-15 13:42:54.809117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22c85c0 name raid_bdev1, state offline 00:20:07.417 [2024-07-15 13:42:54.832240] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:07.675 13:42:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:07.675 00:20:07.675 real 0m26.657s 00:20:07.675 user 0m40.181s 00:20:07.675 sys 0m3.830s 00:20:07.675 13:42:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:07.675 13:42:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:07.675 ************************************ 00:20:07.675 END TEST raid_rebuild_test_sb_io 00:20:07.675 ************************************ 00:20:07.675 13:42:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:07.675 13:42:55 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:20:07.676 13:42:55 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:20:07.676 13:42:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:07.676 13:42:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:07.676 13:42:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:07.676 ************************************ 00:20:07.676 START TEST raid_rebuild_test 00:20:07.676 ************************************ 00:20:07.676 
13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=71351 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 71351 /var/tmp/spdk-raid.sock 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 71351 ']' 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:07.676 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.676 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:07.676 [2024-07-15 13:42:55.174656] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:20:07.676 [2024-07-15 13:42:55.174707] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71351 ] 00:20:07.676 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:07.676 Zero copy mechanism will not be used. 00:20:07.676 [2024-07-15 13:42:55.260798] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:07.933 [2024-07-15 13:42:55.348951] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:07.933 [2024-07-15 13:42:55.399975] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:07.933 [2024-07-15 13:42:55.400019] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:08.497 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:08.497 13:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:20:08.497 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:08.497 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:08.755 BaseBdev1_malloc 00:20:08.755 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:08.755 [2024-07-15 13:42:56.318434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:08.755 [2024-07-15 13:42:56.318471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:08.755 [2024-07-15 13:42:56.318489] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf1600 00:20:08.755 [2024-07-15 13:42:56.318498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:08.755 [2024-07-15 13:42:56.319743] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:08.755 [2024-07-15 13:42:56.319768] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:08.755 BaseBdev1 00:20:08.755 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:08.755 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:09.012 BaseBdev2_malloc 00:20:09.012 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:09.269 [2024-07-15 13:42:56.676686] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:09.269 [2024-07-15 13:42:56.676724] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.270 [2024-07-15 13:42:56.676744] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf2120 00:20:09.270 [2024-07-15 13:42:56.676752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.270 [2024-07-15 13:42:56.677955] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.270 [2024-07-15 13:42:56.677980] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:09.270 BaseBdev2 00:20:09.270 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:09.270 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:09.270 BaseBdev3_malloc 00:20:09.270 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:09.527 [2024-07-15 13:42:57.017550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:09.527 [2024-07-15 13:42:57.017589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.527 [2024-07-15 13:42:57.017606] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf9f1b0 00:20:09.527 [2024-07-15 13:42:57.017616] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.527 [2024-07-15 13:42:57.018741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.527 [2024-07-15 13:42:57.018766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:09.527 BaseBdev3 00:20:09.527 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:09.527 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:09.784 BaseBdev4_malloc 00:20:09.784 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:09.784 [2024-07-15 13:42:57.378232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:09.784 [2024-07-15 13:42:57.378271] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.784 [2024-07-15 13:42:57.378288] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf9e390 00:20:09.784 [2024-07-15 13:42:57.378296] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 
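Editor's note: each of the four BaseBdev devices in this test is built with the same two RPCs that produce the NOTICE lines above: a malloc bdev followed by a passthru bdev layered on top of it, so the test can later simulate device loss by deleting and recreating the passthru (as the earlier raid_rebuild_test_sb_io steps in this log do for BaseBdev1). A hedged sketch of that pair, using the sizes and naming scheme of this run:

    # Sketch: a 32 MiB malloc bdev with 512-byte blocks (65536 blocks total),
    # wrapped in a passthru bdev that the raid bdev will later claim as BaseBdev4.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    "$RPC" -s "$SOCK" bdev_malloc_create 32 512 -b BaseBdev4_malloc
    "$RPC" -s "$SOCK" bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4

The spare device further down adds a delay bdev (spare_delay) between the malloc and the passthru, but otherwise follows the same layering.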
00:20:09.784 [2024-07-15 13:42:57.379423] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.784 [2024-07-15 13:42:57.379447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:09.784 BaseBdev4 00:20:09.784 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:10.042 spare_malloc 00:20:10.042 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:10.300 spare_delay 00:20:10.300 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:10.300 [2024-07-15 13:42:57.904506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:10.300 [2024-07-15 13:42:57.904545] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:10.300 [2024-07-15 13:42:57.904561] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa2e70 00:20:10.300 [2024-07-15 13:42:57.904570] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:10.300 [2024-07-15 13:42:57.905719] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:10.300 [2024-07-15 13:42:57.905742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:10.300 spare 00:20:10.557 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:10.557 [2024-07-15 13:42:58.080977] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:10.557 [2024-07-15 13:42:58.081806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:10.557 [2024-07-15 13:42:58.081844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.557 [2024-07-15 13:42:58.081874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:10.557 [2024-07-15 13:42:58.081930] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf22160 00:20:10.557 [2024-07-15 13:42:58.081938] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:10.558 [2024-07-15 13:42:58.082091] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf9c6d0 00:20:10.558 [2024-07-15 13:42:58.082193] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf22160 00:20:10.558 [2024-07-15 13:42:58.082200] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf22160 00:20:10.558 [2024-07-15 13:42:58.082278] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.558 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.814 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.815 "name": "raid_bdev1", 00:20:10.815 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:10.815 "strip_size_kb": 0, 00:20:10.815 "state": "online", 00:20:10.815 "raid_level": "raid1", 00:20:10.815 "superblock": false, 00:20:10.815 "num_base_bdevs": 4, 00:20:10.815 "num_base_bdevs_discovered": 4, 00:20:10.815 "num_base_bdevs_operational": 4, 00:20:10.815 "base_bdevs_list": [ 00:20:10.815 { 00:20:10.815 "name": "BaseBdev1", 00:20:10.815 "uuid": "bc6e87f3-b376-5def-b88a-93410233e549", 00:20:10.815 "is_configured": true, 00:20:10.815 "data_offset": 0, 00:20:10.815 "data_size": 65536 00:20:10.815 }, 00:20:10.815 { 00:20:10.815 "name": "BaseBdev2", 00:20:10.815 "uuid": "c6931f2e-5531-55a4-b06e-eac6b4e71fba", 00:20:10.815 "is_configured": true, 00:20:10.815 "data_offset": 0, 00:20:10.815 "data_size": 65536 00:20:10.815 }, 00:20:10.815 { 00:20:10.815 "name": "BaseBdev3", 00:20:10.815 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:10.815 "is_configured": true, 00:20:10.815 "data_offset": 0, 00:20:10.815 "data_size": 65536 00:20:10.815 }, 00:20:10.815 { 00:20:10.815 "name": "BaseBdev4", 00:20:10.815 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:10.815 "is_configured": true, 00:20:10.815 "data_offset": 0, 00:20:10.815 "data_size": 65536 00:20:10.815 } 00:20:10.815 ] 00:20:10.815 }' 00:20:10.815 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.815 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.377 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:11.377 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:11.377 [2024-07-15 13:42:58.951453] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:11.377 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:11.377 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.377 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:11.634 
13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:11.634 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:11.890 [2024-07-15 13:42:59.312204] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf9c6d0 00:20:11.890 /dev/nbd0 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:11.890 1+0 records in 00:20:11.890 1+0 records out 00:20:11.890 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027104 s, 15.1 MB/s 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
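Editor's note: the xtrace immediately above and below is the generic waitfornbd helper probing /dev/nbd0; the substance of this step is that raid_bdev1 is exported as a Linux block device over NBD and then overwritten end to end with random data before the rebuild paths are exercised. A condensed sketch of that flow, using the device node and sizes from this run:

    # Sketch: expose raid_bdev1 as /dev/nbd0 through the RPC socket, fill the
    # whole 32 MiB array (65536 x 512-byte blocks) with direct I/O, then detach.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    "$RPC" -s "$SOCK" nbd_start_disk raid_bdev1 /dev/nbd0
    dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
    "$RPC" -s "$SOCK" nbd_stop_disk /dev/nbd0

The 65536+0 records and ~32 MiB copied reported a few lines later are exactly this dd completing against the exported raid bdev.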
00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:20:11.890 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:20:17.137 65536+0 records in 00:20:17.137 65536+0 records out 00:20:17.137 33554432 bytes (34 MB, 32 MiB) copied, 5.31087 s, 6.3 MB/s 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:17.137 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:17.393 [2024-07-15 13:43:04.894307] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:17.393 13:43:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:17.650 [2024-07-15 13:43:05.078048] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.650 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.906 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.906 "name": "raid_bdev1", 00:20:17.906 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:17.906 "strip_size_kb": 0, 00:20:17.906 "state": "online", 00:20:17.906 "raid_level": "raid1", 00:20:17.906 "superblock": false, 00:20:17.906 "num_base_bdevs": 4, 00:20:17.906 "num_base_bdevs_discovered": 3, 00:20:17.906 "num_base_bdevs_operational": 3, 00:20:17.906 "base_bdevs_list": [ 00:20:17.906 { 00:20:17.906 "name": null, 00:20:17.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.906 "is_configured": false, 00:20:17.906 "data_offset": 0, 00:20:17.906 "data_size": 65536 00:20:17.906 }, 00:20:17.906 { 00:20:17.906 "name": "BaseBdev2", 00:20:17.906 "uuid": "c6931f2e-5531-55a4-b06e-eac6b4e71fba", 00:20:17.906 "is_configured": true, 00:20:17.906 "data_offset": 0, 00:20:17.906 "data_size": 65536 00:20:17.906 }, 00:20:17.906 { 00:20:17.906 "name": "BaseBdev3", 00:20:17.906 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:17.906 "is_configured": true, 00:20:17.906 "data_offset": 0, 00:20:17.906 "data_size": 65536 00:20:17.906 }, 00:20:17.906 { 00:20:17.906 "name": "BaseBdev4", 00:20:17.906 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:17.906 "is_configured": true, 00:20:17.906 "data_offset": 0, 00:20:17.906 "data_size": 65536 00:20:17.906 } 00:20:17.906 ] 00:20:17.906 }' 00:20:17.906 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.906 13:43:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:18.162 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:18.419 [2024-07-15 13:43:05.932267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:18.419 [2024-07-15 13:43:05.936102] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf27f70 00:20:18.419 [2024-07-15 13:43:05.937787] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:18.419 13:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.349 13:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:19.606 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:19.606 "name": "raid_bdev1", 00:20:19.606 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:19.606 "strip_size_kb": 0, 00:20:19.606 "state": "online", 00:20:19.606 "raid_level": "raid1", 00:20:19.606 "superblock": false, 00:20:19.606 "num_base_bdevs": 4, 00:20:19.606 "num_base_bdevs_discovered": 4, 00:20:19.606 "num_base_bdevs_operational": 4, 00:20:19.606 "process": { 00:20:19.606 "type": "rebuild", 00:20:19.606 "target": "spare", 00:20:19.606 "progress": { 00:20:19.606 "blocks": 22528, 00:20:19.606 "percent": 34 00:20:19.606 } 00:20:19.606 }, 00:20:19.606 "base_bdevs_list": [ 00:20:19.606 { 00:20:19.606 "name": "spare", 00:20:19.606 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:19.606 "is_configured": true, 00:20:19.606 "data_offset": 0, 00:20:19.606 "data_size": 65536 00:20:19.606 }, 00:20:19.606 { 00:20:19.606 "name": "BaseBdev2", 00:20:19.606 "uuid": "c6931f2e-5531-55a4-b06e-eac6b4e71fba", 00:20:19.606 "is_configured": true, 00:20:19.606 "data_offset": 0, 00:20:19.606 "data_size": 65536 00:20:19.606 }, 00:20:19.606 { 00:20:19.606 "name": "BaseBdev3", 00:20:19.606 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:19.606 "is_configured": true, 00:20:19.606 "data_offset": 0, 00:20:19.606 "data_size": 65536 00:20:19.606 }, 00:20:19.606 { 00:20:19.606 "name": "BaseBdev4", 00:20:19.606 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:19.606 "is_configured": true, 00:20:19.606 "data_offset": 0, 00:20:19.606 "data_size": 65536 00:20:19.606 } 00:20:19.606 ] 00:20:19.606 }' 00:20:19.606 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:19.606 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:19.606 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:19.606 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:19.606 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:19.863 [2024-07-15 13:43:07.374187] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:19.863 [2024-07-15 13:43:07.449031] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:19.863 [2024-07-15 13:43:07.449068] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:19.863 [2024-07-15 13:43:07.449079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:19.863 [2024-07-15 13:43:07.449085] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.863 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.120 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.120 "name": "raid_bdev1", 00:20:20.120 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:20.120 "strip_size_kb": 0, 00:20:20.120 "state": "online", 00:20:20.120 "raid_level": "raid1", 00:20:20.120 "superblock": false, 00:20:20.120 "num_base_bdevs": 4, 00:20:20.120 "num_base_bdevs_discovered": 3, 00:20:20.120 "num_base_bdevs_operational": 3, 00:20:20.120 "base_bdevs_list": [ 00:20:20.120 { 00:20:20.120 "name": null, 00:20:20.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.120 "is_configured": false, 00:20:20.120 "data_offset": 0, 00:20:20.120 "data_size": 65536 00:20:20.120 }, 00:20:20.120 { 00:20:20.120 "name": "BaseBdev2", 00:20:20.120 "uuid": "c6931f2e-5531-55a4-b06e-eac6b4e71fba", 00:20:20.120 "is_configured": true, 00:20:20.120 "data_offset": 0, 00:20:20.120 "data_size": 65536 00:20:20.120 }, 00:20:20.120 { 00:20:20.120 "name": "BaseBdev3", 00:20:20.120 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:20.120 "is_configured": true, 00:20:20.120 "data_offset": 0, 00:20:20.120 "data_size": 65536 00:20:20.120 }, 00:20:20.120 { 00:20:20.120 "name": "BaseBdev4", 00:20:20.120 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:20.120 "is_configured": true, 00:20:20.120 "data_offset": 0, 00:20:20.120 "data_size": 65536 00:20:20.120 } 00:20:20.120 ] 00:20:20.120 }' 00:20:20.120 13:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.120 13:43:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.683 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.941 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:20:20.941 "name": "raid_bdev1", 00:20:20.941 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:20.941 "strip_size_kb": 0, 00:20:20.941 "state": "online", 00:20:20.941 "raid_level": "raid1", 00:20:20.941 "superblock": false, 00:20:20.941 "num_base_bdevs": 4, 00:20:20.941 "num_base_bdevs_discovered": 3, 00:20:20.941 "num_base_bdevs_operational": 3, 00:20:20.941 "base_bdevs_list": [ 00:20:20.941 { 00:20:20.941 "name": null, 00:20:20.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.941 "is_configured": false, 00:20:20.941 "data_offset": 0, 00:20:20.941 "data_size": 65536 00:20:20.941 }, 00:20:20.941 { 00:20:20.941 "name": "BaseBdev2", 00:20:20.941 "uuid": "c6931f2e-5531-55a4-b06e-eac6b4e71fba", 00:20:20.941 "is_configured": true, 00:20:20.941 "data_offset": 0, 00:20:20.941 "data_size": 65536 00:20:20.941 }, 00:20:20.941 { 00:20:20.941 "name": "BaseBdev3", 00:20:20.941 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:20.941 "is_configured": true, 00:20:20.941 "data_offset": 0, 00:20:20.941 "data_size": 65536 00:20:20.941 }, 00:20:20.941 { 00:20:20.941 "name": "BaseBdev4", 00:20:20.941 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:20.941 "is_configured": true, 00:20:20.941 "data_offset": 0, 00:20:20.941 "data_size": 65536 00:20:20.941 } 00:20:20.941 ] 00:20:20.941 }' 00:20:20.941 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:20.941 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:20.941 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:20.941 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:20.941 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:21.198 [2024-07-15 13:43:08.563658] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:21.198 [2024-07-15 13:43:08.567876] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf27f70 00:20:21.198 [2024-07-15 13:43:08.569024] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:21.198 13:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.128 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:22.385 "name": "raid_bdev1", 00:20:22.385 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:22.385 "strip_size_kb": 0, 00:20:22.385 "state": "online", 
00:20:22.385 "raid_level": "raid1", 00:20:22.385 "superblock": false, 00:20:22.385 "num_base_bdevs": 4, 00:20:22.385 "num_base_bdevs_discovered": 4, 00:20:22.385 "num_base_bdevs_operational": 4, 00:20:22.385 "process": { 00:20:22.385 "type": "rebuild", 00:20:22.385 "target": "spare", 00:20:22.385 "progress": { 00:20:22.385 "blocks": 22528, 00:20:22.385 "percent": 34 00:20:22.385 } 00:20:22.385 }, 00:20:22.385 "base_bdevs_list": [ 00:20:22.385 { 00:20:22.385 "name": "spare", 00:20:22.385 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:22.385 "is_configured": true, 00:20:22.385 "data_offset": 0, 00:20:22.385 "data_size": 65536 00:20:22.385 }, 00:20:22.385 { 00:20:22.385 "name": "BaseBdev2", 00:20:22.385 "uuid": "c6931f2e-5531-55a4-b06e-eac6b4e71fba", 00:20:22.385 "is_configured": true, 00:20:22.385 "data_offset": 0, 00:20:22.385 "data_size": 65536 00:20:22.385 }, 00:20:22.385 { 00:20:22.385 "name": "BaseBdev3", 00:20:22.385 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:22.385 "is_configured": true, 00:20:22.385 "data_offset": 0, 00:20:22.385 "data_size": 65536 00:20:22.385 }, 00:20:22.385 { 00:20:22.385 "name": "BaseBdev4", 00:20:22.385 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:22.385 "is_configured": true, 00:20:22.385 "data_offset": 0, 00:20:22.385 "data_size": 65536 00:20:22.385 } 00:20:22.385 ] 00:20:22.385 }' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:22.385 13:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:22.385 [2024-07-15 13:43:09.996190] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:22.642 [2024-07-15 13:43:10.080129] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xf27f70 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.642 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:22.901 "name": "raid_bdev1", 00:20:22.901 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:22.901 "strip_size_kb": 0, 00:20:22.901 "state": "online", 00:20:22.901 "raid_level": "raid1", 00:20:22.901 "superblock": false, 00:20:22.901 "num_base_bdevs": 4, 00:20:22.901 "num_base_bdevs_discovered": 3, 00:20:22.901 "num_base_bdevs_operational": 3, 00:20:22.901 "process": { 00:20:22.901 "type": "rebuild", 00:20:22.901 "target": "spare", 00:20:22.901 "progress": { 00:20:22.901 "blocks": 32768, 00:20:22.901 "percent": 50 00:20:22.901 } 00:20:22.901 }, 00:20:22.901 "base_bdevs_list": [ 00:20:22.901 { 00:20:22.901 "name": "spare", 00:20:22.901 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:22.901 "is_configured": true, 00:20:22.901 "data_offset": 0, 00:20:22.901 "data_size": 65536 00:20:22.901 }, 00:20:22.901 { 00:20:22.901 "name": null, 00:20:22.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.901 "is_configured": false, 00:20:22.901 "data_offset": 0, 00:20:22.901 "data_size": 65536 00:20:22.901 }, 00:20:22.901 { 00:20:22.901 "name": "BaseBdev3", 00:20:22.901 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:22.901 "is_configured": true, 00:20:22.901 "data_offset": 0, 00:20:22.901 "data_size": 65536 00:20:22.901 }, 00:20:22.901 { 00:20:22.901 "name": "BaseBdev4", 00:20:22.901 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:22.901 "is_configured": true, 00:20:22.901 "data_offset": 0, 00:20:22.901 "data_size": 65536 00:20:22.901 } 00:20:22.901 ] 00:20:22.901 }' 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=690 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.901 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:23.193 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:23.193 "name": "raid_bdev1", 00:20:23.193 "uuid": 
"6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:23.193 "strip_size_kb": 0, 00:20:23.193 "state": "online", 00:20:23.193 "raid_level": "raid1", 00:20:23.193 "superblock": false, 00:20:23.193 "num_base_bdevs": 4, 00:20:23.193 "num_base_bdevs_discovered": 3, 00:20:23.193 "num_base_bdevs_operational": 3, 00:20:23.193 "process": { 00:20:23.193 "type": "rebuild", 00:20:23.193 "target": "spare", 00:20:23.193 "progress": { 00:20:23.193 "blocks": 38912, 00:20:23.193 "percent": 59 00:20:23.193 } 00:20:23.193 }, 00:20:23.193 "base_bdevs_list": [ 00:20:23.193 { 00:20:23.193 "name": "spare", 00:20:23.193 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:23.193 "is_configured": true, 00:20:23.193 "data_offset": 0, 00:20:23.193 "data_size": 65536 00:20:23.193 }, 00:20:23.193 { 00:20:23.193 "name": null, 00:20:23.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:23.193 "is_configured": false, 00:20:23.193 "data_offset": 0, 00:20:23.193 "data_size": 65536 00:20:23.193 }, 00:20:23.193 { 00:20:23.193 "name": "BaseBdev3", 00:20:23.193 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:23.193 "is_configured": true, 00:20:23.193 "data_offset": 0, 00:20:23.193 "data_size": 65536 00:20:23.193 }, 00:20:23.193 { 00:20:23.193 "name": "BaseBdev4", 00:20:23.193 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:23.193 "is_configured": true, 00:20:23.193 "data_offset": 0, 00:20:23.193 "data_size": 65536 00:20:23.193 } 00:20:23.193 ] 00:20:23.193 }' 00:20:23.193 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:23.193 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:23.193 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:23.193 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:23.193 13:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.126 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.385 [2024-07-15 13:43:11.792872] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:24.385 [2024-07-15 13:43:11.792917] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:24.385 [2024-07-15 13:43:11.792943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.385 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:24.385 "name": "raid_bdev1", 00:20:24.385 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:24.385 
"strip_size_kb": 0, 00:20:24.385 "state": "online", 00:20:24.385 "raid_level": "raid1", 00:20:24.385 "superblock": false, 00:20:24.385 "num_base_bdevs": 4, 00:20:24.385 "num_base_bdevs_discovered": 3, 00:20:24.385 "num_base_bdevs_operational": 3, 00:20:24.385 "process": { 00:20:24.385 "type": "rebuild", 00:20:24.385 "target": "spare", 00:20:24.385 "progress": { 00:20:24.385 "blocks": 63488, 00:20:24.385 "percent": 96 00:20:24.385 } 00:20:24.385 }, 00:20:24.385 "base_bdevs_list": [ 00:20:24.385 { 00:20:24.385 "name": "spare", 00:20:24.385 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:24.385 "is_configured": true, 00:20:24.385 "data_offset": 0, 00:20:24.385 "data_size": 65536 00:20:24.385 }, 00:20:24.385 { 00:20:24.385 "name": null, 00:20:24.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.385 "is_configured": false, 00:20:24.385 "data_offset": 0, 00:20:24.385 "data_size": 65536 00:20:24.385 }, 00:20:24.385 { 00:20:24.385 "name": "BaseBdev3", 00:20:24.385 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:24.385 "is_configured": true, 00:20:24.385 "data_offset": 0, 00:20:24.385 "data_size": 65536 00:20:24.385 }, 00:20:24.385 { 00:20:24.385 "name": "BaseBdev4", 00:20:24.385 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:24.385 "is_configured": true, 00:20:24.385 "data_offset": 0, 00:20:24.385 "data_size": 65536 00:20:24.385 } 00:20:24.385 ] 00:20:24.385 }' 00:20:24.385 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:24.385 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:24.385 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:24.385 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:24.385 13:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.321 13:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:25.580 "name": "raid_bdev1", 00:20:25.580 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:25.580 "strip_size_kb": 0, 00:20:25.580 "state": "online", 00:20:25.580 "raid_level": "raid1", 00:20:25.580 "superblock": false, 00:20:25.580 "num_base_bdevs": 4, 00:20:25.580 "num_base_bdevs_discovered": 3, 00:20:25.580 "num_base_bdevs_operational": 3, 00:20:25.580 "base_bdevs_list": [ 00:20:25.580 { 00:20:25.580 "name": "spare", 00:20:25.580 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:25.580 "is_configured": true, 00:20:25.580 
"data_offset": 0, 00:20:25.580 "data_size": 65536 00:20:25.580 }, 00:20:25.580 { 00:20:25.580 "name": null, 00:20:25.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.580 "is_configured": false, 00:20:25.580 "data_offset": 0, 00:20:25.580 "data_size": 65536 00:20:25.580 }, 00:20:25.580 { 00:20:25.580 "name": "BaseBdev3", 00:20:25.580 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:25.580 "is_configured": true, 00:20:25.580 "data_offset": 0, 00:20:25.580 "data_size": 65536 00:20:25.580 }, 00:20:25.580 { 00:20:25.580 "name": "BaseBdev4", 00:20:25.580 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:25.580 "is_configured": true, 00:20:25.580 "data_offset": 0, 00:20:25.580 "data_size": 65536 00:20:25.580 } 00:20:25.580 ] 00:20:25.580 }' 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.580 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:25.838 "name": "raid_bdev1", 00:20:25.838 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:25.838 "strip_size_kb": 0, 00:20:25.838 "state": "online", 00:20:25.838 "raid_level": "raid1", 00:20:25.838 "superblock": false, 00:20:25.838 "num_base_bdevs": 4, 00:20:25.838 "num_base_bdevs_discovered": 3, 00:20:25.838 "num_base_bdevs_operational": 3, 00:20:25.838 "base_bdevs_list": [ 00:20:25.838 { 00:20:25.838 "name": "spare", 00:20:25.838 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:25.838 "is_configured": true, 00:20:25.838 "data_offset": 0, 00:20:25.838 "data_size": 65536 00:20:25.838 }, 00:20:25.838 { 00:20:25.838 "name": null, 00:20:25.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.838 "is_configured": false, 00:20:25.838 "data_offset": 0, 00:20:25.838 "data_size": 65536 00:20:25.838 }, 00:20:25.838 { 00:20:25.838 "name": "BaseBdev3", 00:20:25.838 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:25.838 "is_configured": true, 00:20:25.838 "data_offset": 0, 00:20:25.838 "data_size": 65536 00:20:25.838 }, 00:20:25.838 { 00:20:25.838 "name": "BaseBdev4", 00:20:25.838 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:25.838 "is_configured": true, 00:20:25.838 "data_offset": 0, 00:20:25.838 "data_size": 65536 00:20:25.838 } 00:20:25.838 ] 00:20:25.838 
}' 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.838 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.097 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.097 "name": "raid_bdev1", 00:20:26.097 "uuid": "6fec8c8e-82f0-4f0c-a11b-f844561e21a4", 00:20:26.097 "strip_size_kb": 0, 00:20:26.097 "state": "online", 00:20:26.097 "raid_level": "raid1", 00:20:26.097 "superblock": false, 00:20:26.097 "num_base_bdevs": 4, 00:20:26.097 "num_base_bdevs_discovered": 3, 00:20:26.097 "num_base_bdevs_operational": 3, 00:20:26.097 "base_bdevs_list": [ 00:20:26.097 { 00:20:26.097 "name": "spare", 00:20:26.097 "uuid": "2d609219-99d3-50a8-b628-79393303e87f", 00:20:26.097 "is_configured": true, 00:20:26.097 "data_offset": 0, 00:20:26.097 "data_size": 65536 00:20:26.097 }, 00:20:26.097 { 00:20:26.097 "name": null, 00:20:26.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.097 "is_configured": false, 00:20:26.097 "data_offset": 0, 00:20:26.097 "data_size": 65536 00:20:26.097 }, 00:20:26.097 { 00:20:26.097 "name": "BaseBdev3", 00:20:26.097 "uuid": "65580b10-0630-5302-b7d7-0f2459625dee", 00:20:26.097 "is_configured": true, 00:20:26.097 "data_offset": 0, 00:20:26.097 "data_size": 65536 00:20:26.097 }, 00:20:26.097 { 00:20:26.097 "name": "BaseBdev4", 00:20:26.097 "uuid": "415c523c-92df-5354-951f-95bd553305ab", 00:20:26.097 "is_configured": true, 00:20:26.097 "data_offset": 0, 00:20:26.097 "data_size": 65536 00:20:26.097 } 00:20:26.097 ] 00:20:26.097 }' 00:20:26.097 13:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.097 13:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.663 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:26.663 [2024-07-15 13:43:14.231115] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:26.663 [2024-07-15 13:43:14.231140] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:26.663 [2024-07-15 13:43:14.231181] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:26.663 [2024-07-15 13:43:14.231229] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:26.663 [2024-07-15 13:43:14.231237] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf22160 name raid_bdev1, state offline 00:20:26.663 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:20:26.663 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:26.923 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:27.181 /dev/nbd0 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:27.181 1+0 records in 00:20:27.181 1+0 records out 00:20:27.181 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271345 s, 15.1 MB/s 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:27.181 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:27.182 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:27.182 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:27.182 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:27.182 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:27.182 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:27.182 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:27.440 /dev/nbd1 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:27.441 1+0 records in 00:20:27.441 1+0 records out 00:20:27.441 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023095 s, 17.7 MB/s 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:27.441 13:43:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:27.699 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 71351 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 71351 ']' 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 71351 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 71351 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 71351' 00:20:27.958 killing process with pid 71351 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 71351 00:20:27.958 Received shutdown signal, test time was about 60.000000 seconds 00:20:27.958 00:20:27.958 Latency(us) 00:20:27.958 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:27.958 =================================================================================================================== 00:20:27.958 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:27.958 [2024-07-15 13:43:15.370368] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:27.958 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 71351 00:20:27.958 [2024-07-15 13:43:15.414978] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:20:28.217 00:20:28.217 real 0m20.486s 00:20:28.217 user 0m27.067s 00:20:28.217 sys 0m4.183s 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.217 ************************************ 00:20:28.217 END TEST raid_rebuild_test 00:20:28.217 ************************************ 00:20:28.217 13:43:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:28.217 13:43:15 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:20:28.217 13:43:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:28.217 13:43:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:28.217 13:43:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:28.217 ************************************ 00:20:28.217 START TEST raid_rebuild_test_sb 00:20:28.217 ************************************ 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:28.217 13:43:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=74248 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 74248 /var/tmp/spdk-raid.sock 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:28.217 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 74248 ']' 00:20:28.218 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:28.218 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:28.218 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:28.218 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:28.218 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:28.218 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:28.218 [2024-07-15 13:43:15.749626] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
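[Editor's note, not part of the captured log] The bdev_raid_get_bdevs / jq pairs that recur throughout this trace come from the verify helpers in test/bdev/bdev_raid.sh (bdev_raid.sh@187-190 and the SECONDS-bounded loop at bdev_raid.sh@705-710). A minimal standalone sketch of that polling pattern is shown below; the rpc.py path, socket, and bdev name are taken from this run, while the function name, retry loop, and default timeout are illustrative only and not part of the test suite.

    #!/usr/bin/env bash
    # Sketch only: mirrors the verify_raid_bdev_process polling seen in the trace.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    wait_for_rebuild() {
        # Poll the raid bdev until its background process field reports no
        # active rebuild, or until the timeout (in seconds) expires.
        local name=$1 timeout=${2:-60}
        local info ptype target
        while (( SECONDS < timeout )); do
            info=$($rpc -s $sock bdev_raid_get_bdevs all |
                   jq -r ".[] | select(.name == \"$name\")")
            ptype=$(echo "$info"  | jq -r '.process.type // "none"')
            target=$(echo "$info" | jq -r '.process.target // "none"')
            # While the rebuild runs these read "rebuild"/"spare";
            # once it finishes both fall back to "none".
            [[ $ptype == none && $target == none ]] && return 0
            sleep 1
        done
        return 1
    }

Called as, for example, wait_for_rebuild raid_bdev1 690, this reproduces the 690-second bound the test sets at bdev_raid.sh@705 before it gives up on the rebuild completing.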
00:20:28.218 [2024-07-15 13:43:15.749678] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74248 ] 00:20:28.218 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:28.218 Zero copy mechanism will not be used. 00:20:28.476 [2024-07-15 13:43:15.836255] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:28.476 [2024-07-15 13:43:15.927023] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:28.476 [2024-07-15 13:43:15.978797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:28.476 [2024-07-15 13:43:15.978824] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:29.042 13:43:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:29.042 13:43:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:29.042 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:29.042 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:29.300 BaseBdev1_malloc 00:20:29.300 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:29.300 [2024-07-15 13:43:16.869912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:29.300 [2024-07-15 13:43:16.869953] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.300 [2024-07-15 13:43:16.869971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c9600 00:20:29.300 [2024-07-15 13:43:16.869979] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.300 [2024-07-15 13:43:16.871226] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.300 [2024-07-15 13:43:16.871252] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:29.300 BaseBdev1 00:20:29.300 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:29.300 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:29.558 BaseBdev2_malloc 00:20:29.559 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:29.817 [2024-07-15 13:43:17.227796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:29.817 [2024-07-15 13:43:17.227834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.817 [2024-07-15 13:43:17.227851] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ca120 00:20:29.817 [2024-07-15 13:43:17.227859] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.817 [2024-07-15 13:43:17.229055] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.817 [2024-07-15 13:43:17.229085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:29.817 BaseBdev2 00:20:29.817 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:29.817 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:29.817 BaseBdev3_malloc 00:20:29.817 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:30.075 [2024-07-15 13:43:17.556324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:30.075 [2024-07-15 13:43:17.556361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.075 [2024-07-15 13:43:17.556375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16771b0 00:20:30.075 [2024-07-15 13:43:17.556383] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.075 [2024-07-15 13:43:17.557451] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.075 [2024-07-15 13:43:17.557475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:30.075 BaseBdev3 00:20:30.075 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:30.075 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:30.334 BaseBdev4_malloc 00:20:30.334 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:30.334 [2024-07-15 13:43:17.884782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:30.334 [2024-07-15 13:43:17.884819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.334 [2024-07-15 13:43:17.884834] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1676390 00:20:30.334 [2024-07-15 13:43:17.884843] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.334 [2024-07-15 13:43:17.885964] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.334 [2024-07-15 13:43:17.885990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:30.334 BaseBdev4 00:20:30.334 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:30.593 spare_malloc 00:20:30.593 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:30.851 spare_delay 00:20:30.851 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:30.851 [2024-07-15 13:43:18.399065] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:30.851 [2024-07-15 13:43:18.399101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.851 [2024-07-15 13:43:18.399116] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167ae70 00:20:30.851 [2024-07-15 13:43:18.399124] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.851 [2024-07-15 13:43:18.400296] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.851 [2024-07-15 13:43:18.400319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:30.851 spare 00:20:30.851 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:31.118 [2024-07-15 13:43:18.571539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:31.118 [2024-07-15 13:43:18.572541] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:31.118 [2024-07-15 13:43:18.572582] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:31.118 [2024-07-15 13:43:18.572612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:31.118 [2024-07-15 13:43:18.572751] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15fa160 00:20:31.118 [2024-07-15 13:43:18.572759] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:31.118 [2024-07-15 13:43:18.572901] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16746d0 00:20:31.118 [2024-07-15 13:43:18.573017] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15fa160 00:20:31.118 [2024-07-15 13:43:18.573024] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15fa160 00:20:31.118 [2024-07-15 13:43:18.573109] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.118 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.376 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.376 "name": "raid_bdev1", 00:20:31.376 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:31.376 "strip_size_kb": 0, 00:20:31.376 "state": "online", 00:20:31.376 "raid_level": "raid1", 00:20:31.376 "superblock": true, 00:20:31.376 "num_base_bdevs": 4, 00:20:31.376 "num_base_bdevs_discovered": 4, 00:20:31.376 "num_base_bdevs_operational": 4, 00:20:31.376 "base_bdevs_list": [ 00:20:31.376 { 00:20:31.376 "name": "BaseBdev1", 00:20:31.376 "uuid": "a4889930-1b5c-509e-a109-1d368bb51966", 00:20:31.376 "is_configured": true, 00:20:31.376 "data_offset": 2048, 00:20:31.376 "data_size": 63488 00:20:31.376 }, 00:20:31.376 { 00:20:31.376 "name": "BaseBdev2", 00:20:31.376 "uuid": "0a56d92e-b1c7-55dc-aa97-17f5f1f0acbd", 00:20:31.376 "is_configured": true, 00:20:31.376 "data_offset": 2048, 00:20:31.376 "data_size": 63488 00:20:31.376 }, 00:20:31.376 { 00:20:31.376 "name": "BaseBdev3", 00:20:31.376 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:31.376 "is_configured": true, 00:20:31.376 "data_offset": 2048, 00:20:31.376 "data_size": 63488 00:20:31.376 }, 00:20:31.376 { 00:20:31.376 "name": "BaseBdev4", 00:20:31.376 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:31.376 "is_configured": true, 00:20:31.376 "data_offset": 2048, 00:20:31.376 "data_size": 63488 00:20:31.376 } 00:20:31.376 ] 00:20:31.376 }' 00:20:31.376 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.376 13:43:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:31.941 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:31.941 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:31.941 [2024-07-15 13:43:19.433934] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:31.941 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:31.941 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.941 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:32.199 
13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:32.199 [2024-07-15 13:43:19.790700] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16746d0 00:20:32.199 /dev/nbd0 00:20:32.199 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:32.457 1+0 records in 00:20:32.457 1+0 records out 00:20:32.457 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024842 s, 16.5 MB/s 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:20:32.457 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:20:37.717 63488+0 records in 00:20:37.717 63488+0 records 
out 00:20:37.717 32505856 bytes (33 MB, 31 MiB) copied, 5.13182 s, 6.3 MB/s 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:37.717 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:37.717 [2024-07-15 13:43:25.163620] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:37.717 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:37.717 [2024-07-15 13:43:25.324043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.975 "name": "raid_bdev1", 00:20:37.975 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:37.975 "strip_size_kb": 0, 00:20:37.975 "state": "online", 00:20:37.975 "raid_level": "raid1", 00:20:37.975 "superblock": true, 00:20:37.975 "num_base_bdevs": 4, 00:20:37.975 "num_base_bdevs_discovered": 3, 00:20:37.975 "num_base_bdevs_operational": 3, 00:20:37.975 "base_bdevs_list": [ 00:20:37.975 { 00:20:37.975 "name": null, 00:20:37.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.975 "is_configured": false, 00:20:37.975 "data_offset": 2048, 00:20:37.975 "data_size": 63488 00:20:37.975 }, 00:20:37.975 { 00:20:37.975 "name": "BaseBdev2", 00:20:37.975 "uuid": "0a56d92e-b1c7-55dc-aa97-17f5f1f0acbd", 00:20:37.975 "is_configured": true, 00:20:37.975 "data_offset": 2048, 00:20:37.975 "data_size": 63488 00:20:37.975 }, 00:20:37.975 { 00:20:37.975 "name": "BaseBdev3", 00:20:37.975 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:37.975 "is_configured": true, 00:20:37.975 "data_offset": 2048, 00:20:37.975 "data_size": 63488 00:20:37.975 }, 00:20:37.975 { 00:20:37.975 "name": "BaseBdev4", 00:20:37.975 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:37.975 "is_configured": true, 00:20:37.975 "data_offset": 2048, 00:20:37.975 "data_size": 63488 00:20:37.975 } 00:20:37.975 ] 00:20:37.975 }' 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.975 13:43:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.541 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:38.541 [2024-07-15 13:43:26.146175] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.541 [2024-07-15 13:43:26.149849] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16746d0 00:20:38.541 [2024-07-15 13:43:26.151555] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:38.799 13:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.733 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.733 "name": "raid_bdev1", 00:20:39.733 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:39.733 "strip_size_kb": 0, 00:20:39.733 "state": "online", 00:20:39.733 "raid_level": "raid1", 00:20:39.733 "superblock": 
true, 00:20:39.733 "num_base_bdevs": 4, 00:20:39.733 "num_base_bdevs_discovered": 4, 00:20:39.733 "num_base_bdevs_operational": 4, 00:20:39.733 "process": { 00:20:39.733 "type": "rebuild", 00:20:39.733 "target": "spare", 00:20:39.733 "progress": { 00:20:39.733 "blocks": 22528, 00:20:39.733 "percent": 35 00:20:39.733 } 00:20:39.733 }, 00:20:39.733 "base_bdevs_list": [ 00:20:39.733 { 00:20:39.733 "name": "spare", 00:20:39.733 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:39.733 "is_configured": true, 00:20:39.733 "data_offset": 2048, 00:20:39.733 "data_size": 63488 00:20:39.733 }, 00:20:39.733 { 00:20:39.733 "name": "BaseBdev2", 00:20:39.733 "uuid": "0a56d92e-b1c7-55dc-aa97-17f5f1f0acbd", 00:20:39.733 "is_configured": true, 00:20:39.733 "data_offset": 2048, 00:20:39.733 "data_size": 63488 00:20:39.733 }, 00:20:39.733 { 00:20:39.733 "name": "BaseBdev3", 00:20:39.733 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:39.733 "is_configured": true, 00:20:39.733 "data_offset": 2048, 00:20:39.733 "data_size": 63488 00:20:39.733 }, 00:20:39.733 { 00:20:39.733 "name": "BaseBdev4", 00:20:39.733 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:39.733 "is_configured": true, 00:20:39.733 "data_offset": 2048, 00:20:39.733 "data_size": 63488 00:20:39.733 } 00:20:39.733 ] 00:20:39.733 }' 00:20:39.990 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.990 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:39.990 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.990 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:39.990 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:39.990 [2024-07-15 13:43:27.575874] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:40.249 [2024-07-15 13:43:27.662703] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:40.249 [2024-07-15 13:43:27.662737] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:40.249 [2024-07-15 13:43:27.662748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:40.249 [2024-07-15 13:43:27.662758] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.249 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.507 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.507 "name": "raid_bdev1", 00:20:40.507 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:40.507 "strip_size_kb": 0, 00:20:40.507 "state": "online", 00:20:40.507 "raid_level": "raid1", 00:20:40.507 "superblock": true, 00:20:40.507 "num_base_bdevs": 4, 00:20:40.507 "num_base_bdevs_discovered": 3, 00:20:40.507 "num_base_bdevs_operational": 3, 00:20:40.507 "base_bdevs_list": [ 00:20:40.507 { 00:20:40.507 "name": null, 00:20:40.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.507 "is_configured": false, 00:20:40.507 "data_offset": 2048, 00:20:40.507 "data_size": 63488 00:20:40.508 }, 00:20:40.508 { 00:20:40.508 "name": "BaseBdev2", 00:20:40.508 "uuid": "0a56d92e-b1c7-55dc-aa97-17f5f1f0acbd", 00:20:40.508 "is_configured": true, 00:20:40.508 "data_offset": 2048, 00:20:40.508 "data_size": 63488 00:20:40.508 }, 00:20:40.508 { 00:20:40.508 "name": "BaseBdev3", 00:20:40.508 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:40.508 "is_configured": true, 00:20:40.508 "data_offset": 2048, 00:20:40.508 "data_size": 63488 00:20:40.508 }, 00:20:40.508 { 00:20:40.508 "name": "BaseBdev4", 00:20:40.508 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:40.508 "is_configured": true, 00:20:40.508 "data_offset": 2048, 00:20:40.508 "data_size": 63488 00:20:40.508 } 00:20:40.508 ] 00:20:40.508 }' 00:20:40.508 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.508 13:43:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.765 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.022 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:41.022 "name": "raid_bdev1", 00:20:41.022 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:41.022 "strip_size_kb": 0, 00:20:41.022 "state": "online", 00:20:41.022 "raid_level": "raid1", 00:20:41.022 "superblock": true, 00:20:41.022 "num_base_bdevs": 4, 00:20:41.022 "num_base_bdevs_discovered": 3, 00:20:41.022 "num_base_bdevs_operational": 3, 00:20:41.022 "base_bdevs_list": [ 00:20:41.022 { 00:20:41.022 "name": null, 00:20:41.022 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:41.022 "is_configured": false, 00:20:41.022 "data_offset": 2048, 00:20:41.022 "data_size": 63488 00:20:41.022 }, 00:20:41.022 { 00:20:41.022 "name": "BaseBdev2", 00:20:41.022 "uuid": "0a56d92e-b1c7-55dc-aa97-17f5f1f0acbd", 00:20:41.022 "is_configured": true, 00:20:41.022 "data_offset": 2048, 00:20:41.022 "data_size": 63488 00:20:41.022 }, 00:20:41.022 { 00:20:41.022 "name": "BaseBdev3", 00:20:41.022 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:41.022 "is_configured": true, 00:20:41.022 "data_offset": 2048, 00:20:41.022 "data_size": 63488 00:20:41.022 }, 00:20:41.022 { 00:20:41.022 "name": "BaseBdev4", 00:20:41.022 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:41.022 "is_configured": true, 00:20:41.022 "data_offset": 2048, 00:20:41.022 "data_size": 63488 00:20:41.022 } 00:20:41.022 ] 00:20:41.022 }' 00:20:41.022 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:41.022 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:41.022 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:41.280 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:41.280 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:41.280 [2024-07-15 13:43:28.805253] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:41.280 [2024-07-15 13:43:28.809492] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1666020 00:20:41.280 [2024-07-15 13:43:28.810635] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:41.280 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.212 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.470 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:42.470 "name": "raid_bdev1", 00:20:42.470 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:42.470 "strip_size_kb": 0, 00:20:42.470 "state": "online", 00:20:42.470 "raid_level": "raid1", 00:20:42.470 "superblock": true, 00:20:42.470 "num_base_bdevs": 4, 00:20:42.470 "num_base_bdevs_discovered": 4, 00:20:42.470 "num_base_bdevs_operational": 4, 00:20:42.470 "process": { 00:20:42.470 "type": "rebuild", 00:20:42.470 "target": "spare", 00:20:42.470 "progress": { 00:20:42.470 "blocks": 22528, 00:20:42.470 "percent": 35 00:20:42.470 } 00:20:42.470 }, 
00:20:42.470 "base_bdevs_list": [ 00:20:42.470 { 00:20:42.470 "name": "spare", 00:20:42.470 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:42.470 "is_configured": true, 00:20:42.470 "data_offset": 2048, 00:20:42.470 "data_size": 63488 00:20:42.470 }, 00:20:42.470 { 00:20:42.470 "name": "BaseBdev2", 00:20:42.470 "uuid": "0a56d92e-b1c7-55dc-aa97-17f5f1f0acbd", 00:20:42.470 "is_configured": true, 00:20:42.470 "data_offset": 2048, 00:20:42.470 "data_size": 63488 00:20:42.470 }, 00:20:42.470 { 00:20:42.470 "name": "BaseBdev3", 00:20:42.470 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:42.470 "is_configured": true, 00:20:42.470 "data_offset": 2048, 00:20:42.470 "data_size": 63488 00:20:42.470 }, 00:20:42.470 { 00:20:42.470 "name": "BaseBdev4", 00:20:42.470 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:42.470 "is_configured": true, 00:20:42.470 "data_offset": 2048, 00:20:42.470 "data_size": 63488 00:20:42.470 } 00:20:42.470 ] 00:20:42.470 }' 00:20:42.470 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:42.470 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.470 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:42.727 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:42.727 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:42.727 [2024-07-15 13:43:30.241866] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:42.985 [2024-07-15 13:43:30.421834] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1666020 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.985 13:43:30 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.243 "name": "raid_bdev1", 00:20:43.243 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:43.243 "strip_size_kb": 0, 00:20:43.243 "state": "online", 00:20:43.243 "raid_level": "raid1", 00:20:43.243 "superblock": true, 00:20:43.243 "num_base_bdevs": 4, 00:20:43.243 "num_base_bdevs_discovered": 3, 00:20:43.243 "num_base_bdevs_operational": 3, 00:20:43.243 "process": { 00:20:43.243 "type": "rebuild", 00:20:43.243 "target": "spare", 00:20:43.243 "progress": { 00:20:43.243 "blocks": 32768, 00:20:43.243 "percent": 51 00:20:43.243 } 00:20:43.243 }, 00:20:43.243 "base_bdevs_list": [ 00:20:43.243 { 00:20:43.243 "name": "spare", 00:20:43.243 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:43.243 "is_configured": true, 00:20:43.243 "data_offset": 2048, 00:20:43.243 "data_size": 63488 00:20:43.243 }, 00:20:43.243 { 00:20:43.243 "name": null, 00:20:43.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.243 "is_configured": false, 00:20:43.243 "data_offset": 2048, 00:20:43.243 "data_size": 63488 00:20:43.243 }, 00:20:43.243 { 00:20:43.243 "name": "BaseBdev3", 00:20:43.243 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:43.243 "is_configured": true, 00:20:43.243 "data_offset": 2048, 00:20:43.243 "data_size": 63488 00:20:43.243 }, 00:20:43.243 { 00:20:43.243 "name": "BaseBdev4", 00:20:43.243 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:43.243 "is_configured": true, 00:20:43.243 "data_offset": 2048, 00:20:43.243 "data_size": 63488 00:20:43.243 } 00:20:43.243 ] 00:20:43.243 }' 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=710 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.243 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.500 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.500 "name": "raid_bdev1", 00:20:43.500 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:43.500 "strip_size_kb": 0, 00:20:43.500 "state": "online", 00:20:43.500 "raid_level": "raid1", 
00:20:43.500 "superblock": true, 00:20:43.500 "num_base_bdevs": 4, 00:20:43.500 "num_base_bdevs_discovered": 3, 00:20:43.500 "num_base_bdevs_operational": 3, 00:20:43.500 "process": { 00:20:43.500 "type": "rebuild", 00:20:43.500 "target": "spare", 00:20:43.500 "progress": { 00:20:43.500 "blocks": 38912, 00:20:43.500 "percent": 61 00:20:43.500 } 00:20:43.500 }, 00:20:43.500 "base_bdevs_list": [ 00:20:43.500 { 00:20:43.500 "name": "spare", 00:20:43.500 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:43.500 "is_configured": true, 00:20:43.500 "data_offset": 2048, 00:20:43.500 "data_size": 63488 00:20:43.500 }, 00:20:43.500 { 00:20:43.500 "name": null, 00:20:43.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.500 "is_configured": false, 00:20:43.500 "data_offset": 2048, 00:20:43.500 "data_size": 63488 00:20:43.500 }, 00:20:43.500 { 00:20:43.500 "name": "BaseBdev3", 00:20:43.500 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:43.500 "is_configured": true, 00:20:43.500 "data_offset": 2048, 00:20:43.500 "data_size": 63488 00:20:43.500 }, 00:20:43.500 { 00:20:43.500 "name": "BaseBdev4", 00:20:43.500 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:43.500 "is_configured": true, 00:20:43.500 "data_offset": 2048, 00:20:43.500 "data_size": 63488 00:20:43.500 } 00:20:43.500 ] 00:20:43.500 }' 00:20:43.500 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.500 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:43.500 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.500 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:43.500 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.432 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.432 [2024-07-15 13:43:32.033738] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:44.432 [2024-07-15 13:43:32.033785] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:44.432 [2024-07-15 13:43:32.033869] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:44.689 "name": "raid_bdev1", 00:20:44.689 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:44.689 "strip_size_kb": 0, 00:20:44.689 "state": "online", 00:20:44.689 "raid_level": "raid1", 
00:20:44.689 "superblock": true, 00:20:44.689 "num_base_bdevs": 4, 00:20:44.689 "num_base_bdevs_discovered": 3, 00:20:44.689 "num_base_bdevs_operational": 3, 00:20:44.689 "base_bdevs_list": [ 00:20:44.689 { 00:20:44.689 "name": "spare", 00:20:44.689 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:44.689 "is_configured": true, 00:20:44.689 "data_offset": 2048, 00:20:44.689 "data_size": 63488 00:20:44.689 }, 00:20:44.689 { 00:20:44.689 "name": null, 00:20:44.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.689 "is_configured": false, 00:20:44.689 "data_offset": 2048, 00:20:44.689 "data_size": 63488 00:20:44.689 }, 00:20:44.689 { 00:20:44.689 "name": "BaseBdev3", 00:20:44.689 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:44.689 "is_configured": true, 00:20:44.689 "data_offset": 2048, 00:20:44.689 "data_size": 63488 00:20:44.689 }, 00:20:44.689 { 00:20:44.689 "name": "BaseBdev4", 00:20:44.689 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:44.689 "is_configured": true, 00:20:44.689 "data_offset": 2048, 00:20:44.689 "data_size": 63488 00:20:44.689 } 00:20:44.689 ] 00:20:44.689 }' 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.689 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:44.947 "name": "raid_bdev1", 00:20:44.947 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:44.947 "strip_size_kb": 0, 00:20:44.947 "state": "online", 00:20:44.947 "raid_level": "raid1", 00:20:44.947 "superblock": true, 00:20:44.947 "num_base_bdevs": 4, 00:20:44.947 "num_base_bdevs_discovered": 3, 00:20:44.947 "num_base_bdevs_operational": 3, 00:20:44.947 "base_bdevs_list": [ 00:20:44.947 { 00:20:44.947 "name": "spare", 00:20:44.947 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:44.947 "is_configured": true, 00:20:44.947 "data_offset": 2048, 00:20:44.947 "data_size": 63488 00:20:44.947 }, 00:20:44.947 { 00:20:44.947 "name": null, 00:20:44.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.947 "is_configured": false, 00:20:44.947 "data_offset": 2048, 00:20:44.947 "data_size": 63488 00:20:44.947 }, 00:20:44.947 { 00:20:44.947 "name": "BaseBdev3", 00:20:44.947 "uuid": 
"54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:44.947 "is_configured": true, 00:20:44.947 "data_offset": 2048, 00:20:44.947 "data_size": 63488 00:20:44.947 }, 00:20:44.947 { 00:20:44.947 "name": "BaseBdev4", 00:20:44.947 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:44.947 "is_configured": true, 00:20:44.947 "data_offset": 2048, 00:20:44.947 "data_size": 63488 00:20:44.947 } 00:20:44.947 ] 00:20:44.947 }' 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.947 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.205 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.205 "name": "raid_bdev1", 00:20:45.205 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:45.205 "strip_size_kb": 0, 00:20:45.205 "state": "online", 00:20:45.205 "raid_level": "raid1", 00:20:45.205 "superblock": true, 00:20:45.205 "num_base_bdevs": 4, 00:20:45.205 "num_base_bdevs_discovered": 3, 00:20:45.205 "num_base_bdevs_operational": 3, 00:20:45.205 "base_bdevs_list": [ 00:20:45.205 { 00:20:45.205 "name": "spare", 00:20:45.205 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:45.205 "is_configured": true, 00:20:45.205 "data_offset": 2048, 00:20:45.205 "data_size": 63488 00:20:45.205 }, 00:20:45.205 { 00:20:45.205 "name": null, 00:20:45.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.205 "is_configured": false, 00:20:45.205 "data_offset": 2048, 00:20:45.205 "data_size": 63488 00:20:45.205 }, 00:20:45.205 { 00:20:45.205 "name": "BaseBdev3", 00:20:45.205 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:45.205 "is_configured": true, 00:20:45.205 "data_offset": 2048, 00:20:45.205 "data_size": 63488 00:20:45.205 }, 00:20:45.205 { 00:20:45.205 "name": "BaseBdev4", 00:20:45.205 "uuid": 
"2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:45.205 "is_configured": true, 00:20:45.205 "data_offset": 2048, 00:20:45.205 "data_size": 63488 00:20:45.205 } 00:20:45.205 ] 00:20:45.205 }' 00:20:45.205 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.205 13:43:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.775 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:45.775 [2024-07-15 13:43:33.277450] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:45.775 [2024-07-15 13:43:33.277473] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:45.775 [2024-07-15 13:43:33.277514] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:45.775 [2024-07-15 13:43:33.277561] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:45.775 [2024-07-15 13:43:33.277569] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15fa160 name raid_bdev1, state offline 00:20:45.775 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.775 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:46.038 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:46.038 /dev/nbd0 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:46.295 1+0 records in 00:20:46.295 1+0 records out 00:20:46.295 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277985 s, 14.7 MB/s 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:46.295 /dev/nbd1 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:46.295 1+0 records in 00:20:46.295 1+0 records out 00:20:46.295 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028282 s, 14.5 MB/s 00:20:46.295 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:46.296 13:43:33 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:46.296 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:46.296 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:46.296 13:43:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:46.296 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:46.296 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:46.296 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:46.552 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:46.552 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:46.808 
13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:46.808 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:47.065 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:47.065 [2024-07-15 13:43:34.667953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:47.065 [2024-07-15 13:43:34.667999] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.065 [2024-07-15 13:43:34.668015] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ff1f0 00:20:47.065 [2024-07-15 13:43:34.668023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.065 [2024-07-15 13:43:34.669219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.065 [2024-07-15 13:43:34.669243] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:47.065 [2024-07-15 13:43:34.669300] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:47.065 [2024-07-15 13:43:34.669321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:47.065 [2024-07-15 13:43:34.669396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:47.065 [2024-07-15 13:43:34.669444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:47.065 spare 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.322 [2024-07-15 13:43:34.769746] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1665d10 00:20:47.322 [2024-07-15 13:43:34.769763] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:47.322 [2024-07-15 13:43:34.769912] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ff800 
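The sequence just traced is the recovery path this part of the test exercises: the 'spare' passthru vbdev is torn down with bdev_passthru_delete, recreated on top of spare_delay with bdev_passthru_create, and the raid module then finds the on-disk superblock on spare and re-assembles raid_bdev1 by itself (the raid_bdev_examine_cont and raid_bdev_configure_base_bdev messages above). The lines below are a minimal standalone sketch of that RPC sequence, not the test script itself; the rpc.py path, socket and bdev names are copied from this trace, so adjust them for any other setup.

  # Sketch (assumptions: rpc.py path, socket and bdev names as in the trace above).
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Drop the passthru vbdev sitting on spare_delay, then recreate it.
  # The raid superblock written earlier is still present on the underlying bdev.
  "$rpc" -s "$sock" bdev_passthru_delete spare
  "$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare

  # On create, the raid module examines the superblock on 'spare' and re-adds it
  # to raid_bdev1; confirm the array reports online again.
  "$rpc" -s "$sock" bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .state'

This is the same query the verify_raid_bdev_state calls in this trace rely on: fetch the full array description once with bdev_raid_get_bdevs and let jq pick out the fields being asserted.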
00:20:47.322 [2024-07-15 13:43:34.770039] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1665d10 00:20:47.322 [2024-07-15 13:43:34.770046] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1665d10 00:20:47.322 [2024-07-15 13:43:34.770124] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:47.322 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.322 "name": "raid_bdev1", 00:20:47.322 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:47.322 "strip_size_kb": 0, 00:20:47.322 "state": "online", 00:20:47.322 "raid_level": "raid1", 00:20:47.322 "superblock": true, 00:20:47.322 "num_base_bdevs": 4, 00:20:47.322 "num_base_bdevs_discovered": 3, 00:20:47.322 "num_base_bdevs_operational": 3, 00:20:47.322 "base_bdevs_list": [ 00:20:47.322 { 00:20:47.322 "name": "spare", 00:20:47.322 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:47.322 "is_configured": true, 00:20:47.322 "data_offset": 2048, 00:20:47.322 "data_size": 63488 00:20:47.322 }, 00:20:47.322 { 00:20:47.322 "name": null, 00:20:47.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.323 "is_configured": false, 00:20:47.323 "data_offset": 2048, 00:20:47.323 "data_size": 63488 00:20:47.323 }, 00:20:47.323 { 00:20:47.323 "name": "BaseBdev3", 00:20:47.323 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:47.323 "is_configured": true, 00:20:47.323 "data_offset": 2048, 00:20:47.323 "data_size": 63488 00:20:47.323 }, 00:20:47.323 { 00:20:47.323 "name": "BaseBdev4", 00:20:47.323 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:47.323 "is_configured": true, 00:20:47.323 "data_offset": 2048, 00:20:47.323 "data_size": 63488 00:20:47.323 } 00:20:47.323 ] 00:20:47.323 }' 00:20:47.323 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.323 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.909 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:48.166 "name": "raid_bdev1", 00:20:48.166 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:48.166 "strip_size_kb": 0, 00:20:48.166 "state": "online", 00:20:48.166 "raid_level": "raid1", 00:20:48.166 "superblock": true, 00:20:48.166 "num_base_bdevs": 4, 00:20:48.166 "num_base_bdevs_discovered": 3, 00:20:48.166 "num_base_bdevs_operational": 3, 00:20:48.166 "base_bdevs_list": [ 00:20:48.166 { 00:20:48.166 "name": "spare", 00:20:48.166 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:48.166 "is_configured": true, 00:20:48.166 "data_offset": 2048, 
00:20:48.166 "data_size": 63488 00:20:48.166 }, 00:20:48.166 { 00:20:48.166 "name": null, 00:20:48.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.166 "is_configured": false, 00:20:48.166 "data_offset": 2048, 00:20:48.166 "data_size": 63488 00:20:48.166 }, 00:20:48.166 { 00:20:48.166 "name": "BaseBdev3", 00:20:48.166 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:48.166 "is_configured": true, 00:20:48.166 "data_offset": 2048, 00:20:48.166 "data_size": 63488 00:20:48.166 }, 00:20:48.166 { 00:20:48.166 "name": "BaseBdev4", 00:20:48.166 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:48.166 "is_configured": true, 00:20:48.166 "data_offset": 2048, 00:20:48.166 "data_size": 63488 00:20:48.166 } 00:20:48.166 ] 00:20:48.166 }' 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:48.166 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:48.423 [2024-07-15 13:43:35.939298] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.423 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.680 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.680 "name": "raid_bdev1", 00:20:48.680 "uuid": 
"2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:48.680 "strip_size_kb": 0, 00:20:48.680 "state": "online", 00:20:48.680 "raid_level": "raid1", 00:20:48.680 "superblock": true, 00:20:48.680 "num_base_bdevs": 4, 00:20:48.680 "num_base_bdevs_discovered": 2, 00:20:48.680 "num_base_bdevs_operational": 2, 00:20:48.680 "base_bdevs_list": [ 00:20:48.680 { 00:20:48.680 "name": null, 00:20:48.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.680 "is_configured": false, 00:20:48.680 "data_offset": 2048, 00:20:48.680 "data_size": 63488 00:20:48.680 }, 00:20:48.680 { 00:20:48.680 "name": null, 00:20:48.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.680 "is_configured": false, 00:20:48.680 "data_offset": 2048, 00:20:48.680 "data_size": 63488 00:20:48.680 }, 00:20:48.680 { 00:20:48.680 "name": "BaseBdev3", 00:20:48.680 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:48.680 "is_configured": true, 00:20:48.680 "data_offset": 2048, 00:20:48.680 "data_size": 63488 00:20:48.680 }, 00:20:48.680 { 00:20:48.680 "name": "BaseBdev4", 00:20:48.680 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:48.680 "is_configured": true, 00:20:48.680 "data_offset": 2048, 00:20:48.680 "data_size": 63488 00:20:48.680 } 00:20:48.680 ] 00:20:48.680 }' 00:20:48.680 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.680 13:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.242 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:49.242 [2024-07-15 13:43:36.785473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:49.242 [2024-07-15 13:43:36.785591] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:49.242 [2024-07-15 13:43:36.785603] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:49.242 [2024-07-15 13:43:36.785624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:49.242 [2024-07-15 13:43:36.789195] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16003e0 00:20:49.242 [2024-07-15 13:43:36.790805] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:49.242 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.642 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:50.642 "name": "raid_bdev1", 00:20:50.642 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:50.642 "strip_size_kb": 0, 00:20:50.642 "state": "online", 00:20:50.642 "raid_level": "raid1", 00:20:50.642 "superblock": true, 00:20:50.642 "num_base_bdevs": 4, 00:20:50.642 "num_base_bdevs_discovered": 3, 00:20:50.642 "num_base_bdevs_operational": 3, 00:20:50.642 "process": { 00:20:50.642 "type": "rebuild", 00:20:50.642 "target": "spare", 00:20:50.642 "progress": { 00:20:50.642 "blocks": 22528, 00:20:50.642 "percent": 35 00:20:50.642 } 00:20:50.642 }, 00:20:50.642 "base_bdevs_list": [ 00:20:50.642 { 00:20:50.642 "name": "spare", 00:20:50.642 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:50.642 "is_configured": true, 00:20:50.642 "data_offset": 2048, 00:20:50.642 "data_size": 63488 00:20:50.642 }, 00:20:50.642 { 00:20:50.642 "name": null, 00:20:50.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.642 "is_configured": false, 00:20:50.642 "data_offset": 2048, 00:20:50.642 "data_size": 63488 00:20:50.642 }, 00:20:50.642 { 00:20:50.642 "name": "BaseBdev3", 00:20:50.642 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:50.642 "is_configured": true, 00:20:50.642 "data_offset": 2048, 00:20:50.642 "data_size": 63488 00:20:50.642 }, 00:20:50.642 { 00:20:50.642 "name": "BaseBdev4", 00:20:50.642 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:50.642 "is_configured": true, 00:20:50.642 "data_offset": 2048, 00:20:50.642 "data_size": 63488 00:20:50.642 } 00:20:50.642 ] 00:20:50.642 }' 00:20:50.642 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:50.642 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:50.642 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:50.642 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:50.642 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:50.642 [2024-07-15 13:43:38.239637] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:50.900 [2024-07-15 13:43:38.301908] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:50.900 [2024-07-15 13:43:38.301946] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:50.900 [2024-07-15 13:43:38.301958] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:50.900 [2024-07-15 13:43:38.301964] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.900 "name": "raid_bdev1", 00:20:50.900 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:50.900 "strip_size_kb": 0, 00:20:50.900 "state": "online", 00:20:50.900 "raid_level": "raid1", 00:20:50.900 "superblock": true, 00:20:50.900 "num_base_bdevs": 4, 00:20:50.900 "num_base_bdevs_discovered": 2, 00:20:50.900 "num_base_bdevs_operational": 2, 00:20:50.900 "base_bdevs_list": [ 00:20:50.900 { 00:20:50.900 "name": null, 00:20:50.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.900 "is_configured": false, 00:20:50.900 "data_offset": 2048, 00:20:50.900 "data_size": 63488 00:20:50.900 }, 00:20:50.900 { 00:20:50.900 "name": null, 00:20:50.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.900 "is_configured": false, 00:20:50.900 "data_offset": 2048, 00:20:50.900 "data_size": 63488 00:20:50.900 }, 00:20:50.900 { 00:20:50.900 "name": "BaseBdev3", 00:20:50.900 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:50.900 "is_configured": true, 00:20:50.900 "data_offset": 2048, 00:20:50.900 "data_size": 63488 00:20:50.900 }, 00:20:50.900 { 00:20:50.900 "name": "BaseBdev4", 00:20:50.900 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:50.900 "is_configured": true, 00:20:50.900 "data_offset": 2048, 00:20:50.900 "data_size": 63488 
00:20:50.900 } 00:20:50.900 ] 00:20:50.900 }' 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.900 13:43:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:51.465 13:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:51.723 [2024-07-15 13:43:39.135977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:51.723 [2024-07-15 13:43:39.136019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:51.723 [2024-07-15 13:43:39.136035] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1679a40 00:20:51.723 [2024-07-15 13:43:39.136043] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:51.723 [2024-07-15 13:43:39.136331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:51.723 [2024-07-15 13:43:39.136344] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:51.723 [2024-07-15 13:43:39.136401] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:51.723 [2024-07-15 13:43:39.136410] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:51.723 [2024-07-15 13:43:39.136416] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:51.723 [2024-07-15 13:43:39.136435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:51.723 [2024-07-15 13:43:39.140075] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11d0b40 00:20:51.723 [2024-07-15 13:43:39.141161] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:51.723 spare 00:20:51.723 13:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.654 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.911 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:52.911 "name": "raid_bdev1", 00:20:52.911 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:52.911 "strip_size_kb": 0, 00:20:52.911 "state": "online", 00:20:52.911 "raid_level": "raid1", 00:20:52.911 "superblock": true, 00:20:52.911 "num_base_bdevs": 4, 00:20:52.911 "num_base_bdevs_discovered": 3, 00:20:52.911 "num_base_bdevs_operational": 3, 00:20:52.911 "process": { 00:20:52.911 "type": "rebuild", 00:20:52.911 "target": 
"spare", 00:20:52.911 "progress": { 00:20:52.911 "blocks": 22528, 00:20:52.911 "percent": 35 00:20:52.911 } 00:20:52.911 }, 00:20:52.911 "base_bdevs_list": [ 00:20:52.911 { 00:20:52.911 "name": "spare", 00:20:52.911 "uuid": "2c6a022f-7130-5471-bcb0-3730b0d4fb47", 00:20:52.911 "is_configured": true, 00:20:52.911 "data_offset": 2048, 00:20:52.911 "data_size": 63488 00:20:52.911 }, 00:20:52.911 { 00:20:52.911 "name": null, 00:20:52.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.911 "is_configured": false, 00:20:52.911 "data_offset": 2048, 00:20:52.911 "data_size": 63488 00:20:52.911 }, 00:20:52.911 { 00:20:52.911 "name": "BaseBdev3", 00:20:52.911 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:52.911 "is_configured": true, 00:20:52.911 "data_offset": 2048, 00:20:52.911 "data_size": 63488 00:20:52.911 }, 00:20:52.911 { 00:20:52.911 "name": "BaseBdev4", 00:20:52.911 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:52.911 "is_configured": true, 00:20:52.911 "data_offset": 2048, 00:20:52.911 "data_size": 63488 00:20:52.911 } 00:20:52.911 ] 00:20:52.911 }' 00:20:52.911 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:52.911 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:52.911 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:52.911 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:52.911 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:53.169 [2024-07-15 13:43:40.592482] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:53.169 [2024-07-15 13:43:40.651957] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:53.169 [2024-07-15 13:43:40.651990] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:53.169 [2024-07-15 13:43:40.652006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:53.169 [2024-07-15 13:43:40.652012] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.169 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.426 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.426 "name": "raid_bdev1", 00:20:53.426 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:53.426 "strip_size_kb": 0, 00:20:53.426 "state": "online", 00:20:53.426 "raid_level": "raid1", 00:20:53.426 "superblock": true, 00:20:53.426 "num_base_bdevs": 4, 00:20:53.426 "num_base_bdevs_discovered": 2, 00:20:53.426 "num_base_bdevs_operational": 2, 00:20:53.426 "base_bdevs_list": [ 00:20:53.426 { 00:20:53.426 "name": null, 00:20:53.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.426 "is_configured": false, 00:20:53.426 "data_offset": 2048, 00:20:53.426 "data_size": 63488 00:20:53.426 }, 00:20:53.426 { 00:20:53.426 "name": null, 00:20:53.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.426 "is_configured": false, 00:20:53.426 "data_offset": 2048, 00:20:53.426 "data_size": 63488 00:20:53.426 }, 00:20:53.426 { 00:20:53.426 "name": "BaseBdev3", 00:20:53.426 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:53.426 "is_configured": true, 00:20:53.426 "data_offset": 2048, 00:20:53.426 "data_size": 63488 00:20:53.426 }, 00:20:53.426 { 00:20:53.426 "name": "BaseBdev4", 00:20:53.426 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:53.426 "is_configured": true, 00:20:53.426 "data_offset": 2048, 00:20:53.426 "data_size": 63488 00:20:53.426 } 00:20:53.426 ] 00:20:53.426 }' 00:20:53.426 13:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.426 13:43:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:53.992 "name": "raid_bdev1", 00:20:53.992 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:53.992 "strip_size_kb": 0, 00:20:53.992 "state": "online", 00:20:53.992 "raid_level": "raid1", 00:20:53.992 "superblock": true, 00:20:53.992 "num_base_bdevs": 4, 00:20:53.992 "num_base_bdevs_discovered": 2, 00:20:53.992 "num_base_bdevs_operational": 2, 00:20:53.992 "base_bdevs_list": [ 00:20:53.992 { 00:20:53.992 "name": null, 00:20:53.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.992 "is_configured": false, 00:20:53.992 "data_offset": 2048, 00:20:53.992 "data_size": 63488 00:20:53.992 }, 00:20:53.992 { 00:20:53.992 "name": null, 00:20:53.992 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:53.992 "is_configured": false, 00:20:53.992 "data_offset": 2048, 00:20:53.992 "data_size": 63488 00:20:53.992 }, 00:20:53.992 { 00:20:53.992 "name": "BaseBdev3", 00:20:53.992 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:53.992 "is_configured": true, 00:20:53.992 "data_offset": 2048, 00:20:53.992 "data_size": 63488 00:20:53.992 }, 00:20:53.992 { 00:20:53.992 "name": "BaseBdev4", 00:20:53.992 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:53.992 "is_configured": true, 00:20:53.992 "data_offset": 2048, 00:20:53.992 "data_size": 63488 00:20:53.992 } 00:20:53.992 ] 00:20:53.992 }' 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:53.992 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:54.250 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:54.250 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:54.250 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:54.507 [2024-07-15 13:43:41.951671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:54.507 [2024-07-15 13:43:41.951711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:54.507 [2024-07-15 13:43:41.951725] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bff50 00:20:54.507 [2024-07-15 13:43:41.951734] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:54.507 [2024-07-15 13:43:41.952003] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:54.507 [2024-07-15 13:43:41.952017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:54.507 [2024-07-15 13:43:41.952066] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:54.507 [2024-07-15 13:43:41.952076] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:54.507 [2024-07-15 13:43:41.952083] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:54.507 BaseBdev1 00:20:54.507 13:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:55.439 13:43:42 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.439 13:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.704 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.704 "name": "raid_bdev1", 00:20:55.704 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:55.704 "strip_size_kb": 0, 00:20:55.704 "state": "online", 00:20:55.704 "raid_level": "raid1", 00:20:55.704 "superblock": true, 00:20:55.704 "num_base_bdevs": 4, 00:20:55.704 "num_base_bdevs_discovered": 2, 00:20:55.704 "num_base_bdevs_operational": 2, 00:20:55.704 "base_bdevs_list": [ 00:20:55.704 { 00:20:55.704 "name": null, 00:20:55.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.704 "is_configured": false, 00:20:55.704 "data_offset": 2048, 00:20:55.704 "data_size": 63488 00:20:55.704 }, 00:20:55.704 { 00:20:55.704 "name": null, 00:20:55.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.704 "is_configured": false, 00:20:55.704 "data_offset": 2048, 00:20:55.704 "data_size": 63488 00:20:55.704 }, 00:20:55.704 { 00:20:55.704 "name": "BaseBdev3", 00:20:55.704 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:55.704 "is_configured": true, 00:20:55.704 "data_offset": 2048, 00:20:55.704 "data_size": 63488 00:20:55.704 }, 00:20:55.704 { 00:20:55.704 "name": "BaseBdev4", 00:20:55.704 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:55.704 "is_configured": true, 00:20:55.704 "data_offset": 2048, 00:20:55.704 "data_size": 63488 00:20:55.704 } 00:20:55.704 ] 00:20:55.704 }' 00:20:55.704 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.704 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:56.269 "name": "raid_bdev1", 00:20:56.269 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:56.269 "strip_size_kb": 0, 00:20:56.269 "state": "online", 00:20:56.269 "raid_level": "raid1", 00:20:56.269 "superblock": true, 
00:20:56.269 "num_base_bdevs": 4, 00:20:56.269 "num_base_bdevs_discovered": 2, 00:20:56.269 "num_base_bdevs_operational": 2, 00:20:56.269 "base_bdevs_list": [ 00:20:56.269 { 00:20:56.269 "name": null, 00:20:56.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.269 "is_configured": false, 00:20:56.269 "data_offset": 2048, 00:20:56.269 "data_size": 63488 00:20:56.269 }, 00:20:56.269 { 00:20:56.269 "name": null, 00:20:56.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.269 "is_configured": false, 00:20:56.269 "data_offset": 2048, 00:20:56.269 "data_size": 63488 00:20:56.269 }, 00:20:56.269 { 00:20:56.269 "name": "BaseBdev3", 00:20:56.269 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:56.269 "is_configured": true, 00:20:56.269 "data_offset": 2048, 00:20:56.269 "data_size": 63488 00:20:56.269 }, 00:20:56.269 { 00:20:56.269 "name": "BaseBdev4", 00:20:56.269 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:56.269 "is_configured": true, 00:20:56.269 "data_offset": 2048, 00:20:56.269 "data_size": 63488 00:20:56.269 } 00:20:56.269 ] 00:20:56.269 }' 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:56.269 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:56.526 13:43:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:56.526 [2024-07-15 13:43:44.081157] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:56.526 [2024-07-15 13:43:44.081258] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:56.526 [2024-07-15 13:43:44.081269] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:56.526 request: 00:20:56.526 { 00:20:56.526 "base_bdev": "BaseBdev1", 00:20:56.526 "raid_bdev": "raid_bdev1", 00:20:56.526 "method": "bdev_raid_add_base_bdev", 00:20:56.526 "req_id": 1 00:20:56.526 } 00:20:56.526 Got JSON-RPC error response 00:20:56.526 response: 00:20:56.526 { 00:20:56.526 "code": -22, 00:20:56.526 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:56.526 } 00:20:56.526 13:43:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:20:56.526 13:43:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:56.526 13:43:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:56.526 13:43:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:56.526 13:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.897 "name": "raid_bdev1", 00:20:57.897 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:57.897 "strip_size_kb": 0, 00:20:57.897 "state": "online", 00:20:57.897 "raid_level": "raid1", 00:20:57.897 "superblock": true, 00:20:57.897 "num_base_bdevs": 4, 00:20:57.897 "num_base_bdevs_discovered": 2, 00:20:57.897 "num_base_bdevs_operational": 2, 00:20:57.897 "base_bdevs_list": [ 00:20:57.897 { 00:20:57.897 "name": null, 00:20:57.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.897 "is_configured": false, 00:20:57.897 "data_offset": 2048, 00:20:57.897 "data_size": 63488 00:20:57.897 }, 00:20:57.897 { 00:20:57.897 "name": null, 00:20:57.897 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:57.897 "is_configured": false, 00:20:57.897 "data_offset": 2048, 00:20:57.897 "data_size": 63488 00:20:57.897 }, 00:20:57.897 { 00:20:57.897 "name": "BaseBdev3", 00:20:57.897 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:57.897 "is_configured": true, 00:20:57.897 "data_offset": 2048, 00:20:57.897 "data_size": 63488 00:20:57.897 }, 00:20:57.897 { 00:20:57.897 "name": "BaseBdev4", 00:20:57.897 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:57.897 "is_configured": true, 00:20:57.897 "data_offset": 2048, 00:20:57.897 "data_size": 63488 00:20:57.897 } 00:20:57.897 ] 00:20:57.897 }' 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.897 13:43:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.463 "name": "raid_bdev1", 00:20:58.463 "uuid": "2b72586f-e674-43c6-9636-08ec3dcfe983", 00:20:58.463 "strip_size_kb": 0, 00:20:58.463 "state": "online", 00:20:58.463 "raid_level": "raid1", 00:20:58.463 "superblock": true, 00:20:58.463 "num_base_bdevs": 4, 00:20:58.463 "num_base_bdevs_discovered": 2, 00:20:58.463 "num_base_bdevs_operational": 2, 00:20:58.463 "base_bdevs_list": [ 00:20:58.463 { 00:20:58.463 "name": null, 00:20:58.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.463 "is_configured": false, 00:20:58.463 "data_offset": 2048, 00:20:58.463 "data_size": 63488 00:20:58.463 }, 00:20:58.463 { 00:20:58.463 "name": null, 00:20:58.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.463 "is_configured": false, 00:20:58.463 "data_offset": 2048, 00:20:58.463 "data_size": 63488 00:20:58.463 }, 00:20:58.463 { 00:20:58.463 "name": "BaseBdev3", 00:20:58.463 "uuid": "54233896-9b6d-5b89-9146-2811e7e3f179", 00:20:58.463 "is_configured": true, 00:20:58.463 "data_offset": 2048, 00:20:58.463 "data_size": 63488 00:20:58.463 }, 00:20:58.463 { 00:20:58.463 "name": "BaseBdev4", 00:20:58.463 "uuid": "2d7cae87-37a4-58a2-9c38-cabffbe1e486", 00:20:58.463 "is_configured": true, 00:20:58.463 "data_offset": 2048, 00:20:58.463 "data_size": 63488 00:20:58.463 } 00:20:58.463 ] 00:20:58.463 }' 00:20:58.463 13:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 74248 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 74248 ']' 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 74248 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:58.463 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 74248 00:20:58.721 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:58.721 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:58.721 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 74248' 00:20:58.721 killing process with pid 74248 00:20:58.721 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 74248 00:20:58.721 Received shutdown signal, test time was about 60.000000 seconds 00:20:58.721 00:20:58.721 Latency(us) 00:20:58.721 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:58.721 =================================================================================================================== 00:20:58.721 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:58.721 [2024-07-15 13:43:46.103463] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:58.721 [2024-07-15 13:43:46.103530] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:58.721 [2024-07-15 13:43:46.103574] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:58.721 [2024-07-15 13:43:46.103583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1665d10 name raid_bdev1, state offline 00:20:58.721 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 74248 00:20:58.721 [2024-07-15 13:43:46.150428] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:20:58.979 00:20:58.979 real 0m30.670s 00:20:58.979 user 0m43.446s 00:20:58.979 sys 0m5.678s 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.979 ************************************ 00:20:58.979 END TEST raid_rebuild_test_sb 00:20:58.979 ************************************ 00:20:58.979 13:43:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:58.979 13:43:46 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:20:58.979 13:43:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:58.979 13:43:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:58.979 13:43:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:58.979 ************************************ 00:20:58.979 START TEST raid_rebuild_test_io 00:20:58.979 ************************************ 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=78753 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 78753 /var/tmp/spdk-raid.sock 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 
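Note: the trace above launches the bdevperf application against /var/tmp/spdk-raid.sock (60 s of randrw I/O, 50% read mix, 3 MiB I/O size, queue depth 2) before any bdevs exist; the base devices and the RAID1 bdev are then assembled over JSON-RPC, as the following log lines show. A minimal sketch of that assembly, using only RPC calls that appear later in this trace (the rpc helper function, the loop, and the exact ordering are illustrative assumptions, not the literal script):

  # helper assumed for brevity; the script invokes rpc.py with -s /var/tmp/spdk-raid.sock directly
  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  # one 32 MB malloc bdev (512-byte blocks) per base device, each wrapped in a passthru bdev
  for i in 1 2 3 4; do
      rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
      rpc bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
  done
  # assemble the four passthru bdevs into a RAID1 bdev (no on-disk superblock in the _io variant)
  rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1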
00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 78753 ']' 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:58.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:58.979 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:58.979 [2024-07-15 13:43:46.498105] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:20:58.979 [2024-07-15 13:43:46.498148] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid78753 ] 00:20:58.979 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:58.979 Zero copy mechanism will not be used. 00:20:58.979 [2024-07-15 13:43:46.580682] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:59.237 [2024-07-15 13:43:46.664260] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:59.237 [2024-07-15 13:43:46.724266] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:59.237 [2024-07-15 13:43:46.724291] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:59.801 13:43:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:59.801 13:43:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:20:59.801 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:59.802 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:00.059 BaseBdev1_malloc 00:21:00.059 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:00.059 [2024-07-15 13:43:47.648246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:00.059 [2024-07-15 13:43:47.648286] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.059 [2024-07-15 13:43:47.648304] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14cc600 00:21:00.059 [2024-07-15 13:43:47.648312] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.059 [2024-07-15 13:43:47.649572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.059 [2024-07-15 13:43:47.649597] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:00.059 BaseBdev1 00:21:00.059 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:00.059 13:43:47 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:00.317 BaseBdev2_malloc 00:21:00.317 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:00.574 [2024-07-15 13:43:47.990262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:00.574 [2024-07-15 13:43:47.990300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.574 [2024-07-15 13:43:47.990319] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14cd120 00:21:00.574 [2024-07-15 13:43:47.990327] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.574 [2024-07-15 13:43:47.991490] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.574 [2024-07-15 13:43:47.991513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:00.574 BaseBdev2 00:21:00.574 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:00.574 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:00.574 BaseBdev3_malloc 00:21:00.574 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:00.831 [2024-07-15 13:43:48.332053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:00.831 [2024-07-15 13:43:48.332093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.831 [2024-07-15 13:43:48.332109] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167a1b0 00:21:00.832 [2024-07-15 13:43:48.332118] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.832 [2024-07-15 13:43:48.333274] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.832 [2024-07-15 13:43:48.333297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:00.832 BaseBdev3 00:21:00.832 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:00.832 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:01.089 BaseBdev4_malloc 00:21:01.089 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:01.089 [2024-07-15 13:43:48.678064] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:01.089 [2024-07-15 13:43:48.678102] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.089 [2024-07-15 13:43:48.678118] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1679390 00:21:01.089 [2024-07-15 13:43:48.678127] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.089 [2024-07-15 13:43:48.679249] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.089 [2024-07-15 13:43:48.679274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:01.089 BaseBdev4 00:21:01.089 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:01.347 spare_malloc 00:21:01.347 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:01.604 spare_delay 00:21:01.604 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:01.604 [2024-07-15 13:43:49.183047] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:01.604 [2024-07-15 13:43:49.183084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.604 [2024-07-15 13:43:49.183100] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167de70 00:21:01.604 [2024-07-15 13:43:49.183109] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.604 [2024-07-15 13:43:49.184236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.604 [2024-07-15 13:43:49.184259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:01.604 spare 00:21:01.604 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:01.862 [2024-07-15 13:43:49.343478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:01.862 [2024-07-15 13:43:49.344422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:01.862 [2024-07-15 13:43:49.344461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:01.862 [2024-07-15 13:43:49.344491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:01.862 [2024-07-15 13:43:49.344554] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15fd160 00:21:01.862 [2024-07-15 13:43:49.344560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:01.862 [2024-07-15 13:43:49.344710] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16776d0 00:21:01.862 [2024-07-15 13:43:49.344810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15fd160 00:21:01.862 [2024-07-15 13:43:49.344816] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15fd160 00:21:01.862 [2024-07-15 13:43:49.344896] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.862 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.119 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.120 "name": "raid_bdev1", 00:21:02.120 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:02.120 "strip_size_kb": 0, 00:21:02.120 "state": "online", 00:21:02.120 "raid_level": "raid1", 00:21:02.120 "superblock": false, 00:21:02.120 "num_base_bdevs": 4, 00:21:02.120 "num_base_bdevs_discovered": 4, 00:21:02.120 "num_base_bdevs_operational": 4, 00:21:02.120 "base_bdevs_list": [ 00:21:02.120 { 00:21:02.120 "name": "BaseBdev1", 00:21:02.120 "uuid": "a11a5396-0427-5fa4-a2c1-fb3522c59364", 00:21:02.120 "is_configured": true, 00:21:02.120 "data_offset": 0, 00:21:02.120 "data_size": 65536 00:21:02.120 }, 00:21:02.120 { 00:21:02.120 "name": "BaseBdev2", 00:21:02.120 "uuid": "acc6b68f-66aa-5d13-9c50-933decc91dd2", 00:21:02.120 "is_configured": true, 00:21:02.120 "data_offset": 0, 00:21:02.120 "data_size": 65536 00:21:02.120 }, 00:21:02.120 { 00:21:02.120 "name": "BaseBdev3", 00:21:02.120 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:02.120 "is_configured": true, 00:21:02.120 "data_offset": 0, 00:21:02.120 "data_size": 65536 00:21:02.120 }, 00:21:02.120 { 00:21:02.120 "name": "BaseBdev4", 00:21:02.120 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:02.120 "is_configured": true, 00:21:02.120 "data_offset": 0, 00:21:02.120 "data_size": 65536 00:21:02.120 } 00:21:02.120 ] 00:21:02.120 }' 00:21:02.120 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.120 13:43:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:02.682 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:02.682 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:02.682 [2024-07-15 13:43:50.189876] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:02.682 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:02.682 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
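Note: every verify step in this trace drives rpc.py and filters the JSON with jq rather than parsing free-form output. A short sketch of the same queries, lifted from the commands visible above (the rpc helper and the variable names are illustrative, not the script's own):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  # size check: number of blocks exposed by the assembled raid bdev
  raid_bdev_size=$(rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[].num_blocks')
  # state check: pull the raid_bdev1 entry out of the raid-level listing
  info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # rebuild progress: "rebuild"/"spare" while a rebuild is running, "none" otherwise
  echo "$info" | jq -r '.process.type // "none"'
  echo "$info" | jq -r '.process.target // "none"'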
00:21:02.683 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:02.939 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:02.939 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:02.939 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:02.939 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:02.939 [2024-07-15 13:43:50.476304] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1603230 00:21:02.939 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:02.939 Zero copy mechanism will not be used. 00:21:02.939 Running I/O for 60 seconds... 00:21:03.195 [2024-07-15 13:43:50.568257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:03.195 [2024-07-15 13:43:50.573354] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1603230 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.195 "name": "raid_bdev1", 00:21:03.195 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:03.195 "strip_size_kb": 0, 00:21:03.195 "state": "online", 00:21:03.195 "raid_level": "raid1", 00:21:03.195 "superblock": false, 00:21:03.195 "num_base_bdevs": 4, 00:21:03.195 "num_base_bdevs_discovered": 3, 00:21:03.195 "num_base_bdevs_operational": 3, 00:21:03.195 "base_bdevs_list": [ 00:21:03.195 { 00:21:03.195 "name": null, 00:21:03.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.195 "is_configured": false, 00:21:03.195 "data_offset": 0, 00:21:03.195 "data_size": 65536 00:21:03.195 }, 00:21:03.195 { 00:21:03.195 "name": "BaseBdev2", 00:21:03.195 "uuid": "acc6b68f-66aa-5d13-9c50-933decc91dd2", 00:21:03.195 "is_configured": true, 
00:21:03.195 "data_offset": 0, 00:21:03.195 "data_size": 65536 00:21:03.195 }, 00:21:03.195 { 00:21:03.195 "name": "BaseBdev3", 00:21:03.195 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:03.195 "is_configured": true, 00:21:03.195 "data_offset": 0, 00:21:03.195 "data_size": 65536 00:21:03.195 }, 00:21:03.195 { 00:21:03.195 "name": "BaseBdev4", 00:21:03.195 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:03.195 "is_configured": true, 00:21:03.195 "data_offset": 0, 00:21:03.195 "data_size": 65536 00:21:03.195 } 00:21:03.195 ] 00:21:03.195 }' 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.195 13:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:03.759 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:04.016 [2024-07-15 13:43:51.388431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:04.016 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:04.016 [2024-07-15 13:43:51.438959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11d3b40 00:21:04.016 [2024-07-15 13:43:51.440895] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:04.016 [2024-07-15 13:43:51.557516] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:04.016 [2024-07-15 13:43:51.557826] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:04.273 [2024-07-15 13:43:51.770115] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:04.273 [2024-07-15 13:43:51.770721] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:04.529 [2024-07-15 13:43:52.135239] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:04.529 [2024-07-15 13:43:52.135640] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:04.786 [2024-07-15 13:43:52.252409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:04.786 [2024-07-15 13:43:52.252572] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.042 [2024-07-15 13:43:52.586714] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:05.042 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:05.042 "name": "raid_bdev1", 00:21:05.042 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:05.042 "strip_size_kb": 0, 00:21:05.042 "state": "online", 00:21:05.042 "raid_level": "raid1", 00:21:05.042 "superblock": false, 00:21:05.042 "num_base_bdevs": 4, 00:21:05.042 "num_base_bdevs_discovered": 4, 00:21:05.043 "num_base_bdevs_operational": 4, 00:21:05.043 "process": { 00:21:05.043 "type": "rebuild", 00:21:05.043 "target": "spare", 00:21:05.043 "progress": { 00:21:05.043 "blocks": 14336, 00:21:05.043 "percent": 21 00:21:05.043 } 00:21:05.043 }, 00:21:05.043 "base_bdevs_list": [ 00:21:05.043 { 00:21:05.043 "name": "spare", 00:21:05.043 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:05.043 "is_configured": true, 00:21:05.043 "data_offset": 0, 00:21:05.043 "data_size": 65536 00:21:05.043 }, 00:21:05.043 { 00:21:05.043 "name": "BaseBdev2", 00:21:05.043 "uuid": "acc6b68f-66aa-5d13-9c50-933decc91dd2", 00:21:05.043 "is_configured": true, 00:21:05.043 "data_offset": 0, 00:21:05.043 "data_size": 65536 00:21:05.043 }, 00:21:05.043 { 00:21:05.043 "name": "BaseBdev3", 00:21:05.043 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:05.043 "is_configured": true, 00:21:05.043 "data_offset": 0, 00:21:05.043 "data_size": 65536 00:21:05.043 }, 00:21:05.043 { 00:21:05.043 "name": "BaseBdev4", 00:21:05.043 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:05.043 "is_configured": true, 00:21:05.043 "data_offset": 0, 00:21:05.043 "data_size": 65536 00:21:05.043 } 00:21:05.043 ] 00:21:05.043 }' 00:21:05.043 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:05.300 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:05.300 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:05.300 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:05.300 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:05.300 [2024-07-15 13:43:52.802525] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:05.300 [2024-07-15 13:43:52.803105] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:05.300 [2024-07-15 13:43:52.869863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:05.557 [2024-07-15 13:43:52.927954] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:05.557 [2024-07-15 13:43:52.945133] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:05.557 [2024-07-15 13:43:52.954473] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:05.557 [2024-07-15 13:43:52.954497] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:05.557 [2024-07-15 13:43:52.954506] 
bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:05.557 [2024-07-15 13:43:52.959855] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1603230 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.557 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.813 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.813 "name": "raid_bdev1", 00:21:05.813 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:05.813 "strip_size_kb": 0, 00:21:05.813 "state": "online", 00:21:05.813 "raid_level": "raid1", 00:21:05.813 "superblock": false, 00:21:05.813 "num_base_bdevs": 4, 00:21:05.813 "num_base_bdevs_discovered": 3, 00:21:05.813 "num_base_bdevs_operational": 3, 00:21:05.813 "base_bdevs_list": [ 00:21:05.813 { 00:21:05.813 "name": null, 00:21:05.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.813 "is_configured": false, 00:21:05.813 "data_offset": 0, 00:21:05.813 "data_size": 65536 00:21:05.813 }, 00:21:05.813 { 00:21:05.813 "name": "BaseBdev2", 00:21:05.813 "uuid": "acc6b68f-66aa-5d13-9c50-933decc91dd2", 00:21:05.813 "is_configured": true, 00:21:05.813 "data_offset": 0, 00:21:05.813 "data_size": 65536 00:21:05.813 }, 00:21:05.813 { 00:21:05.813 "name": "BaseBdev3", 00:21:05.813 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:05.813 "is_configured": true, 00:21:05.813 "data_offset": 0, 00:21:05.813 "data_size": 65536 00:21:05.813 }, 00:21:05.813 { 00:21:05.813 "name": "BaseBdev4", 00:21:05.813 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:05.813 "is_configured": true, 00:21:05.813 "data_offset": 0, 00:21:05.813 "data_size": 65536 00:21:05.813 } 00:21:05.813 ] 00:21:05.813 }' 00:21:05.813 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.813 13:43:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:06.377 13:43:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:06.377 "name": "raid_bdev1", 00:21:06.377 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:06.377 "strip_size_kb": 0, 00:21:06.377 "state": "online", 00:21:06.377 "raid_level": "raid1", 00:21:06.377 "superblock": false, 00:21:06.377 "num_base_bdevs": 4, 00:21:06.377 "num_base_bdevs_discovered": 3, 00:21:06.377 "num_base_bdevs_operational": 3, 00:21:06.377 "base_bdevs_list": [ 00:21:06.377 { 00:21:06.377 "name": null, 00:21:06.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.377 "is_configured": false, 00:21:06.377 "data_offset": 0, 00:21:06.377 "data_size": 65536 00:21:06.377 }, 00:21:06.377 { 00:21:06.377 "name": "BaseBdev2", 00:21:06.377 "uuid": "acc6b68f-66aa-5d13-9c50-933decc91dd2", 00:21:06.377 "is_configured": true, 00:21:06.377 "data_offset": 0, 00:21:06.377 "data_size": 65536 00:21:06.377 }, 00:21:06.377 { 00:21:06.377 "name": "BaseBdev3", 00:21:06.377 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:06.377 "is_configured": true, 00:21:06.377 "data_offset": 0, 00:21:06.377 "data_size": 65536 00:21:06.377 }, 00:21:06.377 { 00:21:06.377 "name": "BaseBdev4", 00:21:06.377 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:06.377 "is_configured": true, 00:21:06.377 "data_offset": 0, 00:21:06.377 "data_size": 65536 00:21:06.377 } 00:21:06.377 ] 00:21:06.377 }' 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:06.377 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:06.634 [2024-07-15 13:43:54.141396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:06.634 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:06.634 [2024-07-15 13:43:54.185015] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1603a10 00:21:06.634 [2024-07-15 13:43:54.186145] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:06.891 [2024-07-15 13:43:54.300672] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:06.891 [2024-07-15 13:43:54.301007] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:07.148 [2024-07-15 13:43:54.524217] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:07.148 [2024-07-15 13:43:54.524898] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:07.405 [2024-07-15 13:43:55.010777] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:07.405 [2024-07-15 13:43:55.010950] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.663 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:07.920 "name": "raid_bdev1", 00:21:07.920 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:07.920 "strip_size_kb": 0, 00:21:07.920 "state": "online", 00:21:07.920 "raid_level": "raid1", 00:21:07.920 "superblock": false, 00:21:07.920 "num_base_bdevs": 4, 00:21:07.920 "num_base_bdevs_discovered": 4, 00:21:07.920 "num_base_bdevs_operational": 4, 00:21:07.920 "process": { 00:21:07.920 "type": "rebuild", 00:21:07.920 "target": "spare", 00:21:07.920 "progress": { 00:21:07.920 "blocks": 12288, 00:21:07.920 "percent": 18 00:21:07.920 } 00:21:07.920 }, 00:21:07.920 "base_bdevs_list": [ 00:21:07.920 { 00:21:07.920 "name": "spare", 00:21:07.920 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:07.920 "is_configured": true, 00:21:07.920 "data_offset": 0, 00:21:07.920 "data_size": 65536 00:21:07.920 }, 00:21:07.920 { 00:21:07.920 "name": "BaseBdev2", 00:21:07.920 "uuid": "acc6b68f-66aa-5d13-9c50-933decc91dd2", 00:21:07.920 "is_configured": true, 00:21:07.920 "data_offset": 0, 00:21:07.920 "data_size": 65536 00:21:07.920 }, 00:21:07.920 { 00:21:07.920 "name": "BaseBdev3", 00:21:07.920 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:07.920 "is_configured": true, 00:21:07.920 "data_offset": 0, 00:21:07.920 "data_size": 65536 00:21:07.920 }, 00:21:07.920 { 00:21:07.920 "name": "BaseBdev4", 00:21:07.920 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:07.920 "is_configured": true, 00:21:07.920 "data_offset": 0, 00:21:07.920 "data_size": 65536 00:21:07.920 } 00:21:07.920 ] 00:21:07.920 }' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 
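For readability, the verification the xtrace output above keeps repeating can be condensed into the following shell sketch. This is a reconstruction from the trace, not the exact bdev_raid.sh code; the rpc_py variable is a shorthand introduced here, while the RPC socket path, the bdev name and the jq filters are taken verbatim from the log.

  # Query the RAID bdev over the test RPC socket and assert that a rebuild
  # onto the spare base bdev is in progress (mirrors bdev_raid.sh@187-190).
  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  raid_bdev_info=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  process_type=$(echo "$raid_bdev_info" | jq -r '.process.type // "none"')
  process_target=$(echo "$raid_bdev_info" | jq -r '.process.target // "none"')
  [[ $process_type == "rebuild" ]]    # a rebuild process must be reported
  [[ $process_target == "spare" ]]    # and its target must be the spare bdev

The same pattern, with "none"/"none" expected instead of "rebuild"/"spare", is what the later checks in this trace use to confirm that the rebuild has finished.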
00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:21:07.920 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:08.178 [2024-07-15 13:43:55.612492] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:08.178 [2024-07-15 13:43:55.698034] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1603230 00:21:08.178 [2024-07-15 13:43:55.698055] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1603a10 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.178 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:08.435 [2024-07-15 13:43:55.822675] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:08.435 [2024-07-15 13:43:55.950133] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:08.435 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:08.435 "name": "raid_bdev1", 00:21:08.435 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:08.435 "strip_size_kb": 0, 00:21:08.435 "state": "online", 00:21:08.435 "raid_level": "raid1", 00:21:08.435 "superblock": false, 00:21:08.435 "num_base_bdevs": 4, 00:21:08.435 "num_base_bdevs_discovered": 3, 00:21:08.435 "num_base_bdevs_operational": 3, 00:21:08.435 "process": { 00:21:08.435 "type": "rebuild", 00:21:08.435 "target": "spare", 00:21:08.435 "progress": { 00:21:08.435 "blocks": 20480, 00:21:08.435 "percent": 31 00:21:08.435 } 00:21:08.435 }, 00:21:08.435 "base_bdevs_list": [ 00:21:08.435 { 00:21:08.435 "name": "spare", 00:21:08.435 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:08.435 "is_configured": true, 00:21:08.435 "data_offset": 0, 00:21:08.435 "data_size": 65536 00:21:08.435 }, 00:21:08.435 { 00:21:08.435 "name": null, 00:21:08.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.435 "is_configured": false, 00:21:08.435 "data_offset": 0, 00:21:08.435 "data_size": 
65536 00:21:08.435 }, 00:21:08.435 { 00:21:08.435 "name": "BaseBdev3", 00:21:08.435 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:08.435 "is_configured": true, 00:21:08.435 "data_offset": 0, 00:21:08.435 "data_size": 65536 00:21:08.435 }, 00:21:08.435 { 00:21:08.435 "name": "BaseBdev4", 00:21:08.435 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:08.435 "is_configured": true, 00:21:08.435 "data_offset": 0, 00:21:08.435 "data_size": 65536 00:21:08.435 } 00:21:08.435 ] 00:21:08.435 }' 00:21:08.435 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:08.435 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:08.435 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=736 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.435 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:08.695 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:08.695 "name": "raid_bdev1", 00:21:08.695 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:08.695 "strip_size_kb": 0, 00:21:08.695 "state": "online", 00:21:08.695 "raid_level": "raid1", 00:21:08.695 "superblock": false, 00:21:08.695 "num_base_bdevs": 4, 00:21:08.695 "num_base_bdevs_discovered": 3, 00:21:08.695 "num_base_bdevs_operational": 3, 00:21:08.695 "process": { 00:21:08.695 "type": "rebuild", 00:21:08.695 "target": "spare", 00:21:08.695 "progress": { 00:21:08.695 "blocks": 24576, 00:21:08.695 "percent": 37 00:21:08.695 } 00:21:08.695 }, 00:21:08.695 "base_bdevs_list": [ 00:21:08.695 { 00:21:08.695 "name": "spare", 00:21:08.695 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:08.695 "is_configured": true, 00:21:08.695 "data_offset": 0, 00:21:08.695 "data_size": 65536 00:21:08.695 }, 00:21:08.695 { 00:21:08.695 "name": null, 00:21:08.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.695 "is_configured": false, 00:21:08.695 "data_offset": 0, 00:21:08.695 "data_size": 65536 00:21:08.695 }, 00:21:08.695 { 00:21:08.695 "name": "BaseBdev3", 00:21:08.695 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:08.695 "is_configured": true, 00:21:08.695 "data_offset": 0, 00:21:08.695 "data_size": 65536 00:21:08.695 }, 00:21:08.695 { 00:21:08.695 "name": "BaseBdev4", 00:21:08.695 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:08.695 "is_configured": 
true, 00:21:08.695 "data_offset": 0, 00:21:08.695 "data_size": 65536 00:21:08.695 } 00:21:08.695 ] 00:21:08.695 }' 00:21:08.695 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:08.695 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:08.695 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:08.695 [2024-07-15 13:43:56.296028] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:08.695 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:08.695 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:08.952 [2024-07-15 13:43:56.505844] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:08.952 [2024-07-15 13:43:56.506132] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:09.544 [2024-07-15 13:43:56.838178] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:09.820 [2024-07-15 13:43:57.303823] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:09.820 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.090 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:10.090 "name": "raid_bdev1", 00:21:10.090 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:10.090 "strip_size_kb": 0, 00:21:10.090 "state": "online", 00:21:10.090 "raid_level": "raid1", 00:21:10.090 "superblock": false, 00:21:10.090 "num_base_bdevs": 4, 00:21:10.090 "num_base_bdevs_discovered": 3, 00:21:10.090 "num_base_bdevs_operational": 3, 00:21:10.090 "process": { 00:21:10.090 "type": "rebuild", 00:21:10.090 "target": "spare", 00:21:10.090 "progress": { 00:21:10.090 "blocks": 43008, 00:21:10.090 "percent": 65 00:21:10.090 } 00:21:10.090 }, 00:21:10.090 "base_bdevs_list": [ 00:21:10.090 { 00:21:10.090 "name": "spare", 00:21:10.090 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:10.090 "is_configured": true, 00:21:10.090 "data_offset": 0, 00:21:10.090 "data_size": 65536 00:21:10.091 }, 00:21:10.091 { 00:21:10.091 "name": null, 00:21:10.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.091 "is_configured": false, 00:21:10.091 
"data_offset": 0, 00:21:10.091 "data_size": 65536 00:21:10.091 }, 00:21:10.091 { 00:21:10.091 "name": "BaseBdev3", 00:21:10.091 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:10.091 "is_configured": true, 00:21:10.091 "data_offset": 0, 00:21:10.091 "data_size": 65536 00:21:10.091 }, 00:21:10.091 { 00:21:10.091 "name": "BaseBdev4", 00:21:10.091 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:10.091 "is_configured": true, 00:21:10.091 "data_offset": 0, 00:21:10.091 "data_size": 65536 00:21:10.091 } 00:21:10.091 ] 00:21:10.091 }' 00:21:10.091 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:10.091 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:10.091 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:10.091 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:10.091 13:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:10.348 [2024-07-15 13:43:57.844773] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:21:10.348 [2024-07-15 13:43:57.957352] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:10.348 [2024-07-15 13:43:57.957511] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:10.914 [2024-07-15 13:43:58.300506] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:10.914 [2024-07-15 13:43:58.301155] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:11.173 "name": "raid_bdev1", 00:21:11.173 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:11.173 "strip_size_kb": 0, 00:21:11.173 "state": "online", 00:21:11.173 "raid_level": "raid1", 00:21:11.173 "superblock": false, 00:21:11.173 "num_base_bdevs": 4, 00:21:11.173 "num_base_bdevs_discovered": 3, 00:21:11.173 "num_base_bdevs_operational": 3, 00:21:11.173 "process": { 00:21:11.173 "type": "rebuild", 00:21:11.173 "target": "spare", 00:21:11.173 "progress": { 00:21:11.173 "blocks": 61440, 00:21:11.173 "percent": 93 00:21:11.173 } 
00:21:11.173 }, 00:21:11.173 "base_bdevs_list": [ 00:21:11.173 { 00:21:11.173 "name": "spare", 00:21:11.173 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:11.173 "is_configured": true, 00:21:11.173 "data_offset": 0, 00:21:11.173 "data_size": 65536 00:21:11.173 }, 00:21:11.173 { 00:21:11.173 "name": null, 00:21:11.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:11.173 "is_configured": false, 00:21:11.173 "data_offset": 0, 00:21:11.173 "data_size": 65536 00:21:11.173 }, 00:21:11.173 { 00:21:11.173 "name": "BaseBdev3", 00:21:11.173 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:11.173 "is_configured": true, 00:21:11.173 "data_offset": 0, 00:21:11.173 "data_size": 65536 00:21:11.173 }, 00:21:11.173 { 00:21:11.173 "name": "BaseBdev4", 00:21:11.173 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:11.173 "is_configured": true, 00:21:11.173 "data_offset": 0, 00:21:11.173 "data_size": 65536 00:21:11.173 } 00:21:11.173 ] 00:21:11.173 }' 00:21:11.173 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:11.431 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:11.431 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:11.431 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:11.431 13:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:11.431 [2024-07-15 13:43:58.854234] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:11.431 [2024-07-15 13:43:58.954537] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:11.431 [2024-07-15 13:43:58.955635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.363 13:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:12.621 "name": "raid_bdev1", 00:21:12.621 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:12.621 "strip_size_kb": 0, 00:21:12.621 "state": "online", 00:21:12.621 "raid_level": "raid1", 00:21:12.621 "superblock": false, 00:21:12.621 "num_base_bdevs": 4, 00:21:12.621 "num_base_bdevs_discovered": 3, 00:21:12.621 "num_base_bdevs_operational": 3, 00:21:12.621 "base_bdevs_list": [ 00:21:12.621 { 00:21:12.621 "name": "spare", 00:21:12.621 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:12.621 "is_configured": true, 00:21:12.621 "data_offset": 
0, 00:21:12.621 "data_size": 65536 00:21:12.621 }, 00:21:12.621 { 00:21:12.621 "name": null, 00:21:12.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.621 "is_configured": false, 00:21:12.621 "data_offset": 0, 00:21:12.621 "data_size": 65536 00:21:12.621 }, 00:21:12.621 { 00:21:12.621 "name": "BaseBdev3", 00:21:12.621 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:12.621 "is_configured": true, 00:21:12.621 "data_offset": 0, 00:21:12.621 "data_size": 65536 00:21:12.621 }, 00:21:12.621 { 00:21:12.621 "name": "BaseBdev4", 00:21:12.621 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:12.621 "is_configured": true, 00:21:12.621 "data_offset": 0, 00:21:12.621 "data_size": 65536 00:21:12.621 } 00:21:12.621 ] 00:21:12.621 }' 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.621 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.879 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:12.879 "name": "raid_bdev1", 00:21:12.879 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:12.879 "strip_size_kb": 0, 00:21:12.879 "state": "online", 00:21:12.879 "raid_level": "raid1", 00:21:12.879 "superblock": false, 00:21:12.879 "num_base_bdevs": 4, 00:21:12.879 "num_base_bdevs_discovered": 3, 00:21:12.879 "num_base_bdevs_operational": 3, 00:21:12.879 "base_bdevs_list": [ 00:21:12.879 { 00:21:12.879 "name": "spare", 00:21:12.879 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:12.879 "is_configured": true, 00:21:12.879 "data_offset": 0, 00:21:12.879 "data_size": 65536 00:21:12.879 }, 00:21:12.879 { 00:21:12.879 "name": null, 00:21:12.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.879 "is_configured": false, 00:21:12.879 "data_offset": 0, 00:21:12.879 "data_size": 65536 00:21:12.879 }, 00:21:12.879 { 00:21:12.879 "name": "BaseBdev3", 00:21:12.880 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:12.880 "is_configured": true, 00:21:12.880 "data_offset": 0, 00:21:12.880 "data_size": 65536 00:21:12.880 }, 00:21:12.880 { 00:21:12.880 "name": "BaseBdev4", 00:21:12.880 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:12.880 "is_configured": true, 00:21:12.880 "data_offset": 0, 00:21:12.880 "data_size": 65536 00:21:12.880 } 
00:21:12.880 ] 00:21:12.880 }' 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.880 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.138 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.138 "name": "raid_bdev1", 00:21:13.138 "uuid": "0f7045ac-5dbd-4bb4-870f-acabd86b22a2", 00:21:13.138 "strip_size_kb": 0, 00:21:13.138 "state": "online", 00:21:13.138 "raid_level": "raid1", 00:21:13.138 "superblock": false, 00:21:13.138 "num_base_bdevs": 4, 00:21:13.138 "num_base_bdevs_discovered": 3, 00:21:13.138 "num_base_bdevs_operational": 3, 00:21:13.138 "base_bdevs_list": [ 00:21:13.138 { 00:21:13.138 "name": "spare", 00:21:13.138 "uuid": "d3500013-1edd-5c43-a0fa-b24d77315bc0", 00:21:13.138 "is_configured": true, 00:21:13.138 "data_offset": 0, 00:21:13.138 "data_size": 65536 00:21:13.138 }, 00:21:13.138 { 00:21:13.138 "name": null, 00:21:13.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.138 "is_configured": false, 00:21:13.138 "data_offset": 0, 00:21:13.138 "data_size": 65536 00:21:13.138 }, 00:21:13.138 { 00:21:13.138 "name": "BaseBdev3", 00:21:13.138 "uuid": "327a7891-404b-56c2-945e-e62ed8928079", 00:21:13.138 "is_configured": true, 00:21:13.138 "data_offset": 0, 00:21:13.138 "data_size": 65536 00:21:13.138 }, 00:21:13.138 { 00:21:13.138 "name": "BaseBdev4", 00:21:13.138 "uuid": "6cf2c728-f9fb-5177-90bc-e1c48f96333c", 00:21:13.138 "is_configured": true, 00:21:13.138 "data_offset": 0, 00:21:13.138 "data_size": 65536 00:21:13.138 } 00:21:13.138 ] 00:21:13.138 }' 00:21:13.138 13:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.138 13:44:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:13.703 13:44:01 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:13.703 [2024-07-15 13:44:01.185759] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:13.703 [2024-07-15 13:44:01.185788] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:13.703 00:21:13.703 Latency(us) 00:21:13.703 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:13.703 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:13.703 raid_bdev1 : 10.74 101.42 304.26 0.00 0.00 13621.35 260.01 114887.46 00:21:13.703 =================================================================================================================== 00:21:13.703 Total : 101.42 304.26 0.00 0.00 13621.35 260.01 114887.46 00:21:13.703 [2024-07-15 13:44:01.244606] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.703 [2024-07-15 13:44:01.244627] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:13.703 [2024-07-15 13:44:01.244689] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:13.703 [2024-07-15 13:44:01.244696] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15fd160 name raid_bdev1, state offline 00:21:13.703 0 00:21:13.703 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.703 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:13.961 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:14.229 /dev/nbd0 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd0 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:14.229 1+0 records in 00:21:14.229 1+0 records out 00:21:14.229 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233047 s, 17.6 MB/s 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:21:14.229 /dev/nbd1 00:21:14.229 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:14.488 1+0 records in 00:21:14.488 1+0 records out 00:21:14.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282477 s, 14.5 MB/s 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:14.488 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:14.489 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:14.489 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:14.489 13:44:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- 
# local nbd_name=nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:21:14.747 /dev/nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:14.747 1+0 records in 00:21:14.747 1+0 records out 00:21:14.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000197298 s, 20.8 MB/s 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@884 -- # size=4096 00:21:14.747 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:15.005 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:15.263 
13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 78753 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 78753 ']' 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 78753 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 78753 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 78753' 00:21:15.263 killing process with pid 78753 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 78753 00:21:15.263 Received shutdown signal, test time was about 12.329398 seconds 00:21:15.263 00:21:15.263 Latency(us) 00:21:15.263 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:15.263 =================================================================================================================== 00:21:15.263 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:15.263 [2024-07-15 13:44:02.837573] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:15.263 13:44:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 78753 00:21:15.263 [2024-07-15 13:44:02.877775] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:15.520 13:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:15.520 00:21:15.520 real 0m16.654s 00:21:15.520 user 0m24.620s 00:21:15.520 sys 0m2.955s 00:21:15.520 13:44:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:15.520 13:44:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.520 ************************************ 00:21:15.520 END TEST raid_rebuild_test_io 00:21:15.520 ************************************ 00:21:15.520 13:44:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:15.520 13:44:03 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:21:15.520 13:44:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:15.520 13:44:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:15.520 13:44:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:15.776 ************************************ 00:21:15.776 START TEST raid_rebuild_test_sb_io 00:21:15.776 
************************************ 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=81146 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # 
waitforlisten 81146 /var/tmp/spdk-raid.sock 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:15.776 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 81146 ']' 00:21:15.777 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:15.777 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:15.777 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:15.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:15.777 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:15.777 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.777 [2024-07-15 13:44:03.246747] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:21:15.777 [2024-07-15 13:44:03.246803] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid81146 ] 00:21:15.777 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:15.777 Zero copy mechanism will not be used. 00:21:15.777 [2024-07-15 13:44:03.335927] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:16.034 [2024-07-15 13:44:03.422141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:16.034 [2024-07-15 13:44:03.485845] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:16.034 [2024-07-15 13:44:03.485879] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:16.600 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:16.600 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:21:16.600 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:16.600 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:16.600 BaseBdev1_malloc 00:21:16.858 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:16.858 [2024-07-15 13:44:04.367010] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:16.858 [2024-07-15 13:44:04.367050] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.858 [2024-07-15 13:44:04.367068] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2480600 00:21:16.858 [2024-07-15 13:44:04.367077] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.858 [2024-07-15 13:44:04.368312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:21:16.858 [2024-07-15 13:44:04.368338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:16.858 BaseBdev1 00:21:16.858 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:16.858 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:17.117 BaseBdev2_malloc 00:21:17.117 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:17.117 [2024-07-15 13:44:04.719729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:17.117 [2024-07-15 13:44:04.719764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.117 [2024-07-15 13:44:04.719785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2481120 00:21:17.117 [2024-07-15 13:44:04.719793] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.117 [2024-07-15 13:44:04.720784] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.117 [2024-07-15 13:44:04.720810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:17.117 BaseBdev2 00:21:17.375 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:17.375 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:17.375 BaseBdev3_malloc 00:21:17.375 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:17.633 [2024-07-15 13:44:05.092595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:17.633 [2024-07-15 13:44:05.092634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.633 [2024-07-15 13:44:05.092648] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x262e1b0 00:21:17.633 [2024-07-15 13:44:05.092656] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.633 [2024-07-15 13:44:05.093607] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.633 [2024-07-15 13:44:05.093630] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:17.633 BaseBdev3 00:21:17.633 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:17.633 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:17.891 BaseBdev4_malloc 00:21:17.891 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:17.891 [2024-07-15 13:44:05.445176] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:17.891 [2024-07-15 13:44:05.445213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.891 [2024-07-15 13:44:05.445227] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x262d390 00:21:17.891 [2024-07-15 13:44:05.445235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.891 [2024-07-15 13:44:05.446234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.891 [2024-07-15 13:44:05.446256] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:17.891 BaseBdev4 00:21:17.891 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:18.150 spare_malloc 00:21:18.150 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:18.407 spare_delay 00:21:18.407 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:18.407 [2024-07-15 13:44:05.974160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:18.407 [2024-07-15 13:44:05.974198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:18.407 [2024-07-15 13:44:05.974211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2631e70 00:21:18.407 [2024-07-15 13:44:05.974219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:18.407 [2024-07-15 13:44:05.975181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:18.407 [2024-07-15 13:44:05.975208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:18.407 spare 00:21:18.407 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:18.663 [2024-07-15 13:44:06.154663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:18.663 [2024-07-15 13:44:06.155567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:18.663 [2024-07-15 13:44:06.155607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:18.663 [2024-07-15 13:44:06.155638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:18.663 [2024-07-15 13:44:06.155784] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b1160 00:21:18.663 [2024-07-15 13:44:06.155793] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:18.663 [2024-07-15 13:44:06.155932] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x262b6d0 00:21:18.663 [2024-07-15 13:44:06.156048] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b1160 00:21:18.663 [2024-07-15 13:44:06.156056] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x25b1160 00:21:18.663 [2024-07-15 13:44:06.156119] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.663 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.919 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.919 "name": "raid_bdev1", 00:21:18.919 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:18.919 "strip_size_kb": 0, 00:21:18.919 "state": "online", 00:21:18.919 "raid_level": "raid1", 00:21:18.919 "superblock": true, 00:21:18.919 "num_base_bdevs": 4, 00:21:18.919 "num_base_bdevs_discovered": 4, 00:21:18.919 "num_base_bdevs_operational": 4, 00:21:18.919 "base_bdevs_list": [ 00:21:18.919 { 00:21:18.919 "name": "BaseBdev1", 00:21:18.919 "uuid": "fe7b7784-5333-5f32-ab8e-d26a594b967c", 00:21:18.919 "is_configured": true, 00:21:18.919 "data_offset": 2048, 00:21:18.919 "data_size": 63488 00:21:18.919 }, 00:21:18.919 { 00:21:18.919 "name": "BaseBdev2", 00:21:18.919 "uuid": "e9ef33f6-7324-5b94-a340-7c8d9d4defee", 00:21:18.919 "is_configured": true, 00:21:18.919 "data_offset": 2048, 00:21:18.919 "data_size": 63488 00:21:18.919 }, 00:21:18.919 { 00:21:18.919 "name": "BaseBdev3", 00:21:18.919 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:18.919 "is_configured": true, 00:21:18.919 "data_offset": 2048, 00:21:18.919 "data_size": 63488 00:21:18.919 }, 00:21:18.919 { 00:21:18.919 "name": "BaseBdev4", 00:21:18.919 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:18.919 "is_configured": true, 00:21:18.919 "data_offset": 2048, 00:21:18.919 "data_size": 63488 00:21:18.919 } 00:21:18.919 ] 00:21:18.919 }' 00:21:18.919 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.919 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:19.484 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:19.484 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r 
'.[].num_blocks' 00:21:19.484 [2024-07-15 13:44:07.009039] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:19.484 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:19.484 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.484 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:19.741 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:19.741 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:19.741 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:19.741 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:19.741 [2024-07-15 13:44:07.283607] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x247ff30 00:21:19.741 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:19.741 Zero copy mechanism will not be used. 00:21:19.741 Running I/O for 60 seconds... 00:21:19.999 [2024-07-15 13:44:07.372411] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:19.999 [2024-07-15 13:44:07.377690] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247ff30 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.999 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.999 "name": "raid_bdev1", 00:21:19.999 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:19.999 "strip_size_kb": 0, 00:21:19.999 "state": "online", 00:21:19.999 "raid_level": "raid1", 00:21:19.999 "superblock": true, 00:21:19.999 
"num_base_bdevs": 4, 00:21:19.999 "num_base_bdevs_discovered": 3, 00:21:19.999 "num_base_bdevs_operational": 3, 00:21:20.000 "base_bdevs_list": [ 00:21:20.000 { 00:21:20.000 "name": null, 00:21:20.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.000 "is_configured": false, 00:21:20.000 "data_offset": 2048, 00:21:20.000 "data_size": 63488 00:21:20.000 }, 00:21:20.000 { 00:21:20.000 "name": "BaseBdev2", 00:21:20.000 "uuid": "e9ef33f6-7324-5b94-a340-7c8d9d4defee", 00:21:20.000 "is_configured": true, 00:21:20.000 "data_offset": 2048, 00:21:20.000 "data_size": 63488 00:21:20.000 }, 00:21:20.000 { 00:21:20.000 "name": "BaseBdev3", 00:21:20.000 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:20.000 "is_configured": true, 00:21:20.000 "data_offset": 2048, 00:21:20.000 "data_size": 63488 00:21:20.000 }, 00:21:20.000 { 00:21:20.000 "name": "BaseBdev4", 00:21:20.000 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:20.000 "is_configured": true, 00:21:20.000 "data_offset": 2048, 00:21:20.000 "data_size": 63488 00:21:20.000 } 00:21:20.000 ] 00:21:20.000 }' 00:21:20.000 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.000 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:20.563 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:20.820 [2024-07-15 13:44:08.269449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:20.820 [2024-07-15 13:44:08.309541] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b3460 00:21:20.820 [2024-07-15 13:44:08.311283] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:20.820 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:20.820 [2024-07-15 13:44:08.413317] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:20.820 [2024-07-15 13:44:08.413794] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:21.077 [2024-07-15 13:44:08.622781] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:21.077 [2024-07-15 13:44:08.623391] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:21.642 [2024-07-15 13:44:08.958353] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:21.642 [2024-07-15 13:44:09.183340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:21.642 [2024-07-15 13:44:09.183599] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.900 "name": "raid_bdev1", 00:21:21.900 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:21.900 "strip_size_kb": 0, 00:21:21.900 "state": "online", 00:21:21.900 "raid_level": "raid1", 00:21:21.900 "superblock": true, 00:21:21.900 "num_base_bdevs": 4, 00:21:21.900 "num_base_bdevs_discovered": 4, 00:21:21.900 "num_base_bdevs_operational": 4, 00:21:21.900 "process": { 00:21:21.900 "type": "rebuild", 00:21:21.900 "target": "spare", 00:21:21.900 "progress": { 00:21:21.900 "blocks": 14336, 00:21:21.900 "percent": 22 00:21:21.900 } 00:21:21.900 }, 00:21:21.900 "base_bdevs_list": [ 00:21:21.900 { 00:21:21.900 "name": "spare", 00:21:21.900 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:21.900 "is_configured": true, 00:21:21.900 "data_offset": 2048, 00:21:21.900 "data_size": 63488 00:21:21.900 }, 00:21:21.900 { 00:21:21.900 "name": "BaseBdev2", 00:21:21.900 "uuid": "e9ef33f6-7324-5b94-a340-7c8d9d4defee", 00:21:21.900 "is_configured": true, 00:21:21.900 "data_offset": 2048, 00:21:21.900 "data_size": 63488 00:21:21.900 }, 00:21:21.900 { 00:21:21.900 "name": "BaseBdev3", 00:21:21.900 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:21.900 "is_configured": true, 00:21:21.900 "data_offset": 2048, 00:21:21.900 "data_size": 63488 00:21:21.900 }, 00:21:21.900 { 00:21:21.900 "name": "BaseBdev4", 00:21:21.900 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:21.900 "is_configured": true, 00:21:21.900 "data_offset": 2048, 00:21:21.900 "data_size": 63488 00:21:21.900 } 00:21:21.900 ] 00:21:21.900 }' 00:21:21.900 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:22.157 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:22.157 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:22.157 [2024-07-15 13:44:09.558730] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:22.157 [2024-07-15 13:44:09.559029] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:22.157 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:22.157 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:22.157 [2024-07-15 13:44:09.748934] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:22.415 [2024-07-15 13:44:09.790091] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:22.415 [2024-07-15 13:44:09.799137] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.415 [2024-07-15 13:44:09.799168] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:21:22.415 [2024-07-15 13:44:09.799178] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:22.415 [2024-07-15 13:44:09.814212] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247ff30 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.415 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.415 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.415 "name": "raid_bdev1", 00:21:22.415 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:22.415 "strip_size_kb": 0, 00:21:22.415 "state": "online", 00:21:22.415 "raid_level": "raid1", 00:21:22.415 "superblock": true, 00:21:22.415 "num_base_bdevs": 4, 00:21:22.415 "num_base_bdevs_discovered": 3, 00:21:22.415 "num_base_bdevs_operational": 3, 00:21:22.415 "base_bdevs_list": [ 00:21:22.415 { 00:21:22.415 "name": null, 00:21:22.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.415 "is_configured": false, 00:21:22.415 "data_offset": 2048, 00:21:22.415 "data_size": 63488 00:21:22.415 }, 00:21:22.415 { 00:21:22.415 "name": "BaseBdev2", 00:21:22.415 "uuid": "e9ef33f6-7324-5b94-a340-7c8d9d4defee", 00:21:22.415 "is_configured": true, 00:21:22.415 "data_offset": 2048, 00:21:22.415 "data_size": 63488 00:21:22.415 }, 00:21:22.415 { 00:21:22.415 "name": "BaseBdev3", 00:21:22.415 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:22.415 "is_configured": true, 00:21:22.415 "data_offset": 2048, 00:21:22.415 "data_size": 63488 00:21:22.415 }, 00:21:22.415 { 00:21:22.415 "name": "BaseBdev4", 00:21:22.415 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:22.415 "is_configured": true, 00:21:22.415 "data_offset": 2048, 00:21:22.415 "data_size": 63488 00:21:22.415 } 00:21:22.415 ] 00:21:22.415 }' 00:21:22.415 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.415 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:22.982 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:22.982 13:44:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:22.982 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:22.982 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:22.982 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:22.982 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.982 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.240 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:23.240 "name": "raid_bdev1", 00:21:23.240 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:23.240 "strip_size_kb": 0, 00:21:23.240 "state": "online", 00:21:23.240 "raid_level": "raid1", 00:21:23.240 "superblock": true, 00:21:23.240 "num_base_bdevs": 4, 00:21:23.240 "num_base_bdevs_discovered": 3, 00:21:23.240 "num_base_bdevs_operational": 3, 00:21:23.240 "base_bdevs_list": [ 00:21:23.240 { 00:21:23.240 "name": null, 00:21:23.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.240 "is_configured": false, 00:21:23.240 "data_offset": 2048, 00:21:23.240 "data_size": 63488 00:21:23.240 }, 00:21:23.240 { 00:21:23.240 "name": "BaseBdev2", 00:21:23.240 "uuid": "e9ef33f6-7324-5b94-a340-7c8d9d4defee", 00:21:23.240 "is_configured": true, 00:21:23.240 "data_offset": 2048, 00:21:23.240 "data_size": 63488 00:21:23.240 }, 00:21:23.240 { 00:21:23.240 "name": "BaseBdev3", 00:21:23.240 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:23.240 "is_configured": true, 00:21:23.240 "data_offset": 2048, 00:21:23.240 "data_size": 63488 00:21:23.240 }, 00:21:23.240 { 00:21:23.240 "name": "BaseBdev4", 00:21:23.240 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:23.240 "is_configured": true, 00:21:23.240 "data_offset": 2048, 00:21:23.240 "data_size": 63488 00:21:23.240 } 00:21:23.240 ] 00:21:23.240 }' 00:21:23.240 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:23.240 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:23.240 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:23.240 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:23.240 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:23.499 [2024-07-15 13:44:10.966610] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:23.499 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:23.499 [2024-07-15 13:44:11.000658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2625210 00:21:23.499 [2024-07-15 13:44:11.001756] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:23.499 [2024-07-15 13:44:11.114796] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:23.499 [2024-07-15 13:44:11.116050] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:23.758 [2024-07-15 13:44:11.330776] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:23.758 [2024-07-15 13:44:11.330933] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:24.323 [2024-07-15 13:44:11.663688] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:24.324 [2024-07-15 13:44:11.664745] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:24.581 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.581 [2024-07-15 13:44:12.082677] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:24.581 "name": "raid_bdev1", 00:21:24.581 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:24.581 "strip_size_kb": 0, 00:21:24.581 "state": "online", 00:21:24.581 "raid_level": "raid1", 00:21:24.581 "superblock": true, 00:21:24.581 "num_base_bdevs": 4, 00:21:24.581 "num_base_bdevs_discovered": 4, 00:21:24.581 "num_base_bdevs_operational": 4, 00:21:24.581 "process": { 00:21:24.581 "type": "rebuild", 00:21:24.581 "target": "spare", 00:21:24.581 "progress": { 00:21:24.581 "blocks": 14336, 00:21:24.581 "percent": 22 00:21:24.581 } 00:21:24.581 }, 00:21:24.581 "base_bdevs_list": [ 00:21:24.581 { 00:21:24.581 "name": "spare", 00:21:24.581 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:24.581 "is_configured": true, 00:21:24.581 "data_offset": 2048, 00:21:24.581 "data_size": 63488 00:21:24.581 }, 00:21:24.581 { 00:21:24.581 "name": "BaseBdev2", 00:21:24.581 "uuid": "e9ef33f6-7324-5b94-a340-7c8d9d4defee", 00:21:24.581 "is_configured": true, 00:21:24.581 "data_offset": 2048, 00:21:24.581 "data_size": 63488 00:21:24.581 }, 00:21:24.581 { 00:21:24.581 "name": "BaseBdev3", 00:21:24.581 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:24.581 "is_configured": true, 00:21:24.581 "data_offset": 2048, 00:21:24.581 "data_size": 63488 00:21:24.581 }, 00:21:24.581 { 00:21:24.581 "name": "BaseBdev4", 00:21:24.581 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:24.581 "is_configured": true, 00:21:24.581 "data_offset": 2048, 00:21:24.581 "data_size": 63488 00:21:24.581 } 00:21:24.581 ] 00:21:24.581 }' 00:21:24.581 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:21:24.839 [2024-07-15 13:44:12.214564] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:24.839 [2024-07-15 13:44:12.215182] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:24.839 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:24.839 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:24.840 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:21:24.840 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:24.840 [2024-07-15 13:44:12.429845] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:25.097 [2024-07-15 13:44:12.663314] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x247ff30 00:21:25.097 [2024-07-15 13:44:12.663342] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2625210 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.097 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.355 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:25.355 "name": "raid_bdev1", 00:21:25.355 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:25.355 "strip_size_kb": 0, 00:21:25.355 "state": "online", 00:21:25.355 "raid_level": "raid1", 00:21:25.355 "superblock": true, 00:21:25.355 "num_base_bdevs": 4, 00:21:25.355 "num_base_bdevs_discovered": 3, 00:21:25.355 "num_base_bdevs_operational": 3, 00:21:25.355 
"process": { 00:21:25.355 "type": "rebuild", 00:21:25.355 "target": "spare", 00:21:25.355 "progress": { 00:21:25.355 "blocks": 20480, 00:21:25.355 "percent": 32 00:21:25.355 } 00:21:25.356 }, 00:21:25.356 "base_bdevs_list": [ 00:21:25.356 { 00:21:25.356 "name": "spare", 00:21:25.356 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:25.356 "is_configured": true, 00:21:25.356 "data_offset": 2048, 00:21:25.356 "data_size": 63488 00:21:25.356 }, 00:21:25.356 { 00:21:25.356 "name": null, 00:21:25.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.356 "is_configured": false, 00:21:25.356 "data_offset": 2048, 00:21:25.356 "data_size": 63488 00:21:25.356 }, 00:21:25.356 { 00:21:25.356 "name": "BaseBdev3", 00:21:25.356 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:25.356 "is_configured": true, 00:21:25.356 "data_offset": 2048, 00:21:25.356 "data_size": 63488 00:21:25.356 }, 00:21:25.356 { 00:21:25.356 "name": "BaseBdev4", 00:21:25.356 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:25.356 "is_configured": true, 00:21:25.356 "data_offset": 2048, 00:21:25.356 "data_size": 63488 00:21:25.356 } 00:21:25.356 ] 00:21:25.356 }' 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=752 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.356 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.614 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:25.614 "name": "raid_bdev1", 00:21:25.614 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:25.614 "strip_size_kb": 0, 00:21:25.614 "state": "online", 00:21:25.614 "raid_level": "raid1", 00:21:25.614 "superblock": true, 00:21:25.614 "num_base_bdevs": 4, 00:21:25.614 "num_base_bdevs_discovered": 3, 00:21:25.614 "num_base_bdevs_operational": 3, 00:21:25.614 "process": { 00:21:25.614 "type": "rebuild", 00:21:25.614 "target": "spare", 00:21:25.614 "progress": { 00:21:25.614 "blocks": 24576, 00:21:25.614 "percent": 38 00:21:25.614 } 00:21:25.614 }, 00:21:25.614 "base_bdevs_list": [ 00:21:25.614 { 00:21:25.614 "name": "spare", 00:21:25.614 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 
00:21:25.614 "is_configured": true, 00:21:25.614 "data_offset": 2048, 00:21:25.614 "data_size": 63488 00:21:25.614 }, 00:21:25.614 { 00:21:25.614 "name": null, 00:21:25.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.614 "is_configured": false, 00:21:25.614 "data_offset": 2048, 00:21:25.614 "data_size": 63488 00:21:25.614 }, 00:21:25.614 { 00:21:25.614 "name": "BaseBdev3", 00:21:25.614 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:25.614 "is_configured": true, 00:21:25.614 "data_offset": 2048, 00:21:25.614 "data_size": 63488 00:21:25.614 }, 00:21:25.614 { 00:21:25.614 "name": "BaseBdev4", 00:21:25.614 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:25.614 "is_configured": true, 00:21:25.614 "data_offset": 2048, 00:21:25.614 "data_size": 63488 00:21:25.614 } 00:21:25.614 ] 00:21:25.614 }' 00:21:25.614 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:25.614 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:25.614 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:25.614 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:25.614 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:26.180 [2024-07-15 13:44:13.491592] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:26.180 [2024-07-15 13:44:13.491893] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:26.180 [2024-07-15 13:44:13.629672] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:21:26.438 [2024-07-15 13:44:13.950787] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:21:26.700 [2024-07-15 13:44:14.077617] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.700 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.958 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:26.958 "name": "raid_bdev1", 00:21:26.958 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:26.958 "strip_size_kb": 0, 00:21:26.958 "state": "online", 00:21:26.958 "raid_level": "raid1", 00:21:26.958 "superblock": 
true, 00:21:26.958 "num_base_bdevs": 4, 00:21:26.958 "num_base_bdevs_discovered": 3, 00:21:26.958 "num_base_bdevs_operational": 3, 00:21:26.958 "process": { 00:21:26.958 "type": "rebuild", 00:21:26.959 "target": "spare", 00:21:26.959 "progress": { 00:21:26.959 "blocks": 43008, 00:21:26.959 "percent": 67 00:21:26.959 } 00:21:26.959 }, 00:21:26.959 "base_bdevs_list": [ 00:21:26.959 { 00:21:26.959 "name": "spare", 00:21:26.959 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:26.959 "is_configured": true, 00:21:26.959 "data_offset": 2048, 00:21:26.959 "data_size": 63488 00:21:26.959 }, 00:21:26.959 { 00:21:26.959 "name": null, 00:21:26.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.959 "is_configured": false, 00:21:26.959 "data_offset": 2048, 00:21:26.959 "data_size": 63488 00:21:26.959 }, 00:21:26.959 { 00:21:26.959 "name": "BaseBdev3", 00:21:26.959 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:26.959 "is_configured": true, 00:21:26.959 "data_offset": 2048, 00:21:26.959 "data_size": 63488 00:21:26.959 }, 00:21:26.959 { 00:21:26.959 "name": "BaseBdev4", 00:21:26.959 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:26.959 "is_configured": true, 00:21:26.959 "data_offset": 2048, 00:21:26.959 "data_size": 63488 00:21:26.959 } 00:21:26.959 ] 00:21:26.959 }' 00:21:26.959 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:26.959 [2024-07-15 13:44:14.399209] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:21:26.959 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:26.959 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:26.959 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:26.959 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:27.217 [2024-07-15 13:44:14.602356] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:21:27.781 [2024-07-15 13:44:15.135817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:27.781 [2024-07-15 13:44:15.136132] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:28.038 [2024-07-15 13:44:15.446704] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:28.038 [2024-07-15 13:44:15.478178] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.038 [2024-07-15 13:44:15.480193] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:28.038 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:28.038 "name": "raid_bdev1", 00:21:28.038 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:28.038 "strip_size_kb": 0, 00:21:28.038 "state": "online", 00:21:28.038 "raid_level": "raid1", 00:21:28.038 "superblock": true, 00:21:28.038 "num_base_bdevs": 4, 00:21:28.038 "num_base_bdevs_discovered": 3, 00:21:28.038 "num_base_bdevs_operational": 3, 00:21:28.038 "base_bdevs_list": [ 00:21:28.038 { 00:21:28.038 "name": "spare", 00:21:28.038 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:28.038 "is_configured": true, 00:21:28.038 "data_offset": 2048, 00:21:28.038 "data_size": 63488 00:21:28.038 }, 00:21:28.038 { 00:21:28.038 "name": null, 00:21:28.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.038 "is_configured": false, 00:21:28.038 "data_offset": 2048, 00:21:28.038 "data_size": 63488 00:21:28.038 }, 00:21:28.038 { 00:21:28.038 "name": "BaseBdev3", 00:21:28.038 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:28.038 "is_configured": true, 00:21:28.038 "data_offset": 2048, 00:21:28.039 "data_size": 63488 00:21:28.039 }, 00:21:28.039 { 00:21:28.039 "name": "BaseBdev4", 00:21:28.039 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:28.039 "is_configured": true, 00:21:28.039 "data_offset": 2048, 00:21:28.039 "data_size": 63488 00:21:28.039 } 00:21:28.039 ] 00:21:28.039 }' 00:21:28.039 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.296 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.553 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:28.553 "name": "raid_bdev1", 00:21:28.553 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:28.553 "strip_size_kb": 0, 00:21:28.554 "state": "online", 00:21:28.554 
"raid_level": "raid1", 00:21:28.554 "superblock": true, 00:21:28.554 "num_base_bdevs": 4, 00:21:28.554 "num_base_bdevs_discovered": 3, 00:21:28.554 "num_base_bdevs_operational": 3, 00:21:28.554 "base_bdevs_list": [ 00:21:28.554 { 00:21:28.554 "name": "spare", 00:21:28.554 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:28.554 "is_configured": true, 00:21:28.554 "data_offset": 2048, 00:21:28.554 "data_size": 63488 00:21:28.554 }, 00:21:28.554 { 00:21:28.554 "name": null, 00:21:28.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.554 "is_configured": false, 00:21:28.554 "data_offset": 2048, 00:21:28.554 "data_size": 63488 00:21:28.554 }, 00:21:28.554 { 00:21:28.554 "name": "BaseBdev3", 00:21:28.554 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:28.554 "is_configured": true, 00:21:28.554 "data_offset": 2048, 00:21:28.554 "data_size": 63488 00:21:28.554 }, 00:21:28.554 { 00:21:28.554 "name": "BaseBdev4", 00:21:28.554 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:28.554 "is_configured": true, 00:21:28.554 "data_offset": 2048, 00:21:28.554 "data_size": 63488 00:21:28.554 } 00:21:28.554 ] 00:21:28.554 }' 00:21:28.554 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:28.554 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:28.554 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.554 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.812 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.812 "name": "raid_bdev1", 00:21:28.812 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:28.812 "strip_size_kb": 0, 00:21:28.812 "state": "online", 00:21:28.812 "raid_level": "raid1", 00:21:28.812 "superblock": true, 00:21:28.812 "num_base_bdevs": 4, 00:21:28.812 "num_base_bdevs_discovered": 3, 00:21:28.812 "num_base_bdevs_operational": 3, 00:21:28.812 
"base_bdevs_list": [ 00:21:28.812 { 00:21:28.812 "name": "spare", 00:21:28.812 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:28.812 "is_configured": true, 00:21:28.812 "data_offset": 2048, 00:21:28.812 "data_size": 63488 00:21:28.812 }, 00:21:28.812 { 00:21:28.812 "name": null, 00:21:28.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.812 "is_configured": false, 00:21:28.812 "data_offset": 2048, 00:21:28.812 "data_size": 63488 00:21:28.812 }, 00:21:28.812 { 00:21:28.812 "name": "BaseBdev3", 00:21:28.812 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:28.812 "is_configured": true, 00:21:28.812 "data_offset": 2048, 00:21:28.812 "data_size": 63488 00:21:28.812 }, 00:21:28.812 { 00:21:28.812 "name": "BaseBdev4", 00:21:28.812 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:28.812 "is_configured": true, 00:21:28.812 "data_offset": 2048, 00:21:28.812 "data_size": 63488 00:21:28.812 } 00:21:28.812 ] 00:21:28.812 }' 00:21:28.812 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.812 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:29.376 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:29.376 [2024-07-15 13:44:16.852539] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:29.376 [2024-07-15 13:44:16.852570] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:29.376 00:21:29.376 Latency(us) 00:21:29.376 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:29.376 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:29.376 raid_bdev1 : 9.58 111.07 333.20 0.00 0.00 12084.08 251.10 116255.17 00:21:29.376 =================================================================================================================== 00:21:29.376 Total : 111.07 333.20 0.00 0.00 12084.08 251.10 116255.17 00:21:29.376 [2024-07-15 13:44:16.891684] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.376 [2024-07-15 13:44:16.891709] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:29.376 [2024-07-15 13:44:16.891774] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:29.376 [2024-07-15 13:44:16.891784] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b1160 name raid_bdev1, state offline 00:21:29.376 0 00:21:29.376 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.376 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:29.633 
13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:29.633 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:29.890 /dev/nbd0 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:29.890 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:29.891 1+0 records in 00:21:29.891 1+0 records out 00:21:29.891 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002602 s, 15.7 MB/s 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:21:29.891 13:44:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:29.891 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:21:29.891 /dev/nbd1 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:30.148 1+0 records in 00:21:30.148 1+0 records out 00:21:30.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257591 s, 15.9 MB/s 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:30.148 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:21:30.406 /dev/nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:30.406 13:44:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:30.406 1+0 records in 00:21:30.406 1+0 records out 00:21:30.406 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257675 s, 15.9 MB/s 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:30.406 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:30.406 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:30.406 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:30.406 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:30.406 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:30.406 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 
)) 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:30.665 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:31.010 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:31.268 [2024-07-15 13:44:18.777148] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:31.268 [2024-07-15 13:44:18.777188] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.268 [2024-07-15 13:44:18.777203] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x262aa50 00:21:31.268 [2024-07-15 13:44:18.777211] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.268 [2024-07-15 13:44:18.778476] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.268 [2024-07-15 13:44:18.778502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:31.268 [2024-07-15 13:44:18.778565] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:31.268 [2024-07-15 13:44:18.778587] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:31.268 [2024-07-15 13:44:18.778665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:31.268 [2024-07-15 13:44:18.778718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:31.268 spare 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.268 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.268 [2024-07-15 13:44:18.879019] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x262b1f0 00:21:31.268 [2024-07-15 13:44:18.879042] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:31.268 [2024-07-15 13:44:18.879186] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b2090 00:21:31.268 [2024-07-15 13:44:18.879301] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x262b1f0 00:21:31.268 [2024-07-15 13:44:18.879312] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x262b1f0 00:21:31.268 [2024-07-15 13:44:18.879386] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:31.526 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.526 "name": "raid_bdev1", 00:21:31.526 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:31.526 "strip_size_kb": 0, 00:21:31.526 "state": "online", 00:21:31.526 "raid_level": "raid1", 00:21:31.526 "superblock": true, 00:21:31.526 "num_base_bdevs": 4, 00:21:31.526 "num_base_bdevs_discovered": 3, 00:21:31.526 "num_base_bdevs_operational": 3, 00:21:31.526 "base_bdevs_list": [ 00:21:31.526 { 00:21:31.526 "name": "spare", 00:21:31.526 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:31.526 "is_configured": true, 00:21:31.526 "data_offset": 2048, 00:21:31.526 "data_size": 63488 00:21:31.526 }, 00:21:31.526 { 00:21:31.526 "name": null, 00:21:31.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.526 "is_configured": false, 00:21:31.526 "data_offset": 2048, 00:21:31.526 "data_size": 63488 00:21:31.526 }, 00:21:31.526 { 00:21:31.526 "name": "BaseBdev3", 
00:21:31.526 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:31.526 "is_configured": true, 00:21:31.526 "data_offset": 2048, 00:21:31.526 "data_size": 63488 00:21:31.526 }, 00:21:31.526 { 00:21:31.526 "name": "BaseBdev4", 00:21:31.526 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:31.526 "is_configured": true, 00:21:31.526 "data_offset": 2048, 00:21:31.526 "data_size": 63488 00:21:31.526 } 00:21:31.526 ] 00:21:31.526 }' 00:21:31.526 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.526 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:32.091 "name": "raid_bdev1", 00:21:32.091 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:32.091 "strip_size_kb": 0, 00:21:32.091 "state": "online", 00:21:32.091 "raid_level": "raid1", 00:21:32.091 "superblock": true, 00:21:32.091 "num_base_bdevs": 4, 00:21:32.091 "num_base_bdevs_discovered": 3, 00:21:32.091 "num_base_bdevs_operational": 3, 00:21:32.091 "base_bdevs_list": [ 00:21:32.091 { 00:21:32.091 "name": "spare", 00:21:32.091 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:32.091 "is_configured": true, 00:21:32.091 "data_offset": 2048, 00:21:32.091 "data_size": 63488 00:21:32.091 }, 00:21:32.091 { 00:21:32.091 "name": null, 00:21:32.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.091 "is_configured": false, 00:21:32.091 "data_offset": 2048, 00:21:32.091 "data_size": 63488 00:21:32.091 }, 00:21:32.091 { 00:21:32.091 "name": "BaseBdev3", 00:21:32.091 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:32.091 "is_configured": true, 00:21:32.091 "data_offset": 2048, 00:21:32.091 "data_size": 63488 00:21:32.091 }, 00:21:32.091 { 00:21:32.091 "name": "BaseBdev4", 00:21:32.091 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:32.091 "is_configured": true, 00:21:32.091 "data_offset": 2048, 00:21:32.091 "data_size": 63488 00:21:32.091 } 00:21:32.091 ] 00:21:32.091 }' 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:32.091 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:32.348 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:32.348 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.348 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:32.348 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:32.348 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:32.605 [2024-07-15 13:44:20.064660] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.605 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.863 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.863 "name": "raid_bdev1", 00:21:32.863 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:32.863 "strip_size_kb": 0, 00:21:32.863 "state": "online", 00:21:32.863 "raid_level": "raid1", 00:21:32.863 "superblock": true, 00:21:32.863 "num_base_bdevs": 4, 00:21:32.863 "num_base_bdevs_discovered": 2, 00:21:32.863 "num_base_bdevs_operational": 2, 00:21:32.863 "base_bdevs_list": [ 00:21:32.863 { 00:21:32.863 "name": null, 00:21:32.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.863 "is_configured": false, 00:21:32.863 "data_offset": 2048, 00:21:32.863 "data_size": 63488 00:21:32.863 }, 00:21:32.863 { 00:21:32.863 "name": null, 00:21:32.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.863 "is_configured": false, 00:21:32.863 "data_offset": 2048, 00:21:32.863 "data_size": 63488 00:21:32.863 }, 00:21:32.863 { 00:21:32.863 "name": "BaseBdev3", 00:21:32.863 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:32.863 "is_configured": true, 00:21:32.863 "data_offset": 2048, 00:21:32.863 "data_size": 63488 00:21:32.863 }, 00:21:32.863 { 00:21:32.863 "name": "BaseBdev4", 00:21:32.863 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:32.863 "is_configured": true, 00:21:32.863 "data_offset": 2048, 00:21:32.863 "data_size": 63488 00:21:32.863 } 00:21:32.863 ] 00:21:32.863 }' 
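For reference, the verify_raid_bdev_state checks traced above boil down to one RPC call plus a jq filter over its JSON output. A minimal stand-alone sketch of the equivalent query is shown below; it assumes the test application from this run is still serving RPCs on /var/tmp/spdk-raid.sock and that the in-tree rpc.py path from the log is used (both are assumptions, not part of the captured log):

    # Query the raid bdev state the same way bdev_raid.sh does:
    # fetch all raid bdevs, pick raid_bdev1, and print the fields the
    # test compares (state, level, discovered/operational base bdevs).
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1")
               | "\(.state) \(.raid_level) \(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational)"'
    # For the dump above this would print: online raid1 2/2

After the bdev_raid_remove_base_bdev spare call earlier in this run, the expected result is exactly the 2/2 discovered/operational count that the log's verify_raid_bdev_state raid_bdev1 online raid1 0 2 invocation asserts.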
00:21:32.863 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.863 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:33.427 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:33.427 [2024-07-15 13:44:20.918970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:33.427 [2024-07-15 13:44:20.919112] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:33.427 [2024-07-15 13:44:20.919124] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:33.427 [2024-07-15 13:44:20.919157] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:33.427 [2024-07-15 13:44:20.923170] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2187b40 00:21:33.427 [2024-07-15 13:44:20.924761] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:33.427 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.360 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:34.618 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:34.618 "name": "raid_bdev1", 00:21:34.618 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:34.618 "strip_size_kb": 0, 00:21:34.618 "state": "online", 00:21:34.618 "raid_level": "raid1", 00:21:34.618 "superblock": true, 00:21:34.618 "num_base_bdevs": 4, 00:21:34.618 "num_base_bdevs_discovered": 3, 00:21:34.618 "num_base_bdevs_operational": 3, 00:21:34.618 "process": { 00:21:34.618 "type": "rebuild", 00:21:34.618 "target": "spare", 00:21:34.618 "progress": { 00:21:34.618 "blocks": 22528, 00:21:34.618 "percent": 35 00:21:34.618 } 00:21:34.618 }, 00:21:34.618 "base_bdevs_list": [ 00:21:34.618 { 00:21:34.618 "name": "spare", 00:21:34.618 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:34.618 "is_configured": true, 00:21:34.618 "data_offset": 2048, 00:21:34.618 "data_size": 63488 00:21:34.618 }, 00:21:34.618 { 00:21:34.618 "name": null, 00:21:34.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:34.618 "is_configured": false, 00:21:34.618 "data_offset": 2048, 00:21:34.618 "data_size": 63488 00:21:34.618 }, 00:21:34.618 { 00:21:34.618 "name": "BaseBdev3", 00:21:34.618 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:34.618 "is_configured": true, 00:21:34.618 
"data_offset": 2048, 00:21:34.618 "data_size": 63488 00:21:34.618 }, 00:21:34.618 { 00:21:34.618 "name": "BaseBdev4", 00:21:34.618 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:34.618 "is_configured": true, 00:21:34.618 "data_offset": 2048, 00:21:34.618 "data_size": 63488 00:21:34.618 } 00:21:34.618 ] 00:21:34.618 }' 00:21:34.618 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:34.618 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:34.618 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:34.618 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:34.618 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:34.876 [2024-07-15 13:44:22.347343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:34.876 [2024-07-15 13:44:22.435982] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:34.876 [2024-07-15 13:44:22.436024] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:34.876 [2024-07-15 13:44:22.436035] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:34.876 [2024-07-15 13:44:22.436040] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.876 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.133 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.133 "name": "raid_bdev1", 00:21:35.133 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:35.133 "strip_size_kb": 0, 00:21:35.133 "state": "online", 00:21:35.133 "raid_level": "raid1", 00:21:35.133 "superblock": true, 00:21:35.133 "num_base_bdevs": 4, 00:21:35.133 "num_base_bdevs_discovered": 2, 00:21:35.133 "num_base_bdevs_operational": 2, 
00:21:35.133 "base_bdevs_list": [ 00:21:35.133 { 00:21:35.133 "name": null, 00:21:35.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.133 "is_configured": false, 00:21:35.133 "data_offset": 2048, 00:21:35.133 "data_size": 63488 00:21:35.133 }, 00:21:35.133 { 00:21:35.133 "name": null, 00:21:35.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.133 "is_configured": false, 00:21:35.133 "data_offset": 2048, 00:21:35.133 "data_size": 63488 00:21:35.133 }, 00:21:35.133 { 00:21:35.133 "name": "BaseBdev3", 00:21:35.134 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:35.134 "is_configured": true, 00:21:35.134 "data_offset": 2048, 00:21:35.134 "data_size": 63488 00:21:35.134 }, 00:21:35.134 { 00:21:35.134 "name": "BaseBdev4", 00:21:35.134 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:35.134 "is_configured": true, 00:21:35.134 "data_offset": 2048, 00:21:35.134 "data_size": 63488 00:21:35.134 } 00:21:35.134 ] 00:21:35.134 }' 00:21:35.134 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.134 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:35.697 13:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:35.697 [2024-07-15 13:44:23.290238] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:35.697 [2024-07-15 13:44:23.290284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:35.697 [2024-07-15 13:44:23.290301] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b1f10 00:21:35.697 [2024-07-15 13:44:23.290309] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:35.697 [2024-07-15 13:44:23.290599] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:35.697 [2024-07-15 13:44:23.290612] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:35.697 [2024-07-15 13:44:23.290676] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:35.697 [2024-07-15 13:44:23.290684] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:35.697 [2024-07-15 13:44:23.290691] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:35.697 [2024-07-15 13:44:23.290705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:35.697 [2024-07-15 13:44:23.294709] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b2a70 00:21:35.697 [2024-07-15 13:44:23.295806] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:35.697 spare 00:21:35.697 13:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:37.064 "name": "raid_bdev1", 00:21:37.064 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:37.064 "strip_size_kb": 0, 00:21:37.064 "state": "online", 00:21:37.064 "raid_level": "raid1", 00:21:37.064 "superblock": true, 00:21:37.064 "num_base_bdevs": 4, 00:21:37.064 "num_base_bdevs_discovered": 3, 00:21:37.064 "num_base_bdevs_operational": 3, 00:21:37.064 "process": { 00:21:37.064 "type": "rebuild", 00:21:37.064 "target": "spare", 00:21:37.064 "progress": { 00:21:37.064 "blocks": 22528, 00:21:37.064 "percent": 35 00:21:37.064 } 00:21:37.064 }, 00:21:37.064 "base_bdevs_list": [ 00:21:37.064 { 00:21:37.064 "name": "spare", 00:21:37.064 "uuid": "d14ca5fa-8bb4-5c21-a922-3a6c10e3f22d", 00:21:37.064 "is_configured": true, 00:21:37.064 "data_offset": 2048, 00:21:37.064 "data_size": 63488 00:21:37.064 }, 00:21:37.064 { 00:21:37.064 "name": null, 00:21:37.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.064 "is_configured": false, 00:21:37.064 "data_offset": 2048, 00:21:37.064 "data_size": 63488 00:21:37.064 }, 00:21:37.064 { 00:21:37.064 "name": "BaseBdev3", 00:21:37.064 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:37.064 "is_configured": true, 00:21:37.064 "data_offset": 2048, 00:21:37.064 "data_size": 63488 00:21:37.064 }, 00:21:37.064 { 00:21:37.064 "name": "BaseBdev4", 00:21:37.064 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:37.064 "is_configured": true, 00:21:37.064 "data_offset": 2048, 00:21:37.064 "data_size": 63488 00:21:37.064 } 00:21:37.064 ] 00:21:37.064 }' 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:37.064 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:37.064 13:44:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:37.320 [2024-07-15 13:44:24.740238] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:37.320 [2024-07-15 13:44:24.807061] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:37.320 [2024-07-15 13:44:24.807099] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:37.320 [2024-07-15 13:44:24.807111] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:37.320 [2024-07-15 13:44:24.807117] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.320 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.577 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.577 "name": "raid_bdev1", 00:21:37.577 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:37.577 "strip_size_kb": 0, 00:21:37.577 "state": "online", 00:21:37.577 "raid_level": "raid1", 00:21:37.577 "superblock": true, 00:21:37.577 "num_base_bdevs": 4, 00:21:37.577 "num_base_bdevs_discovered": 2, 00:21:37.577 "num_base_bdevs_operational": 2, 00:21:37.577 "base_bdevs_list": [ 00:21:37.577 { 00:21:37.577 "name": null, 00:21:37.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.577 "is_configured": false, 00:21:37.577 "data_offset": 2048, 00:21:37.577 "data_size": 63488 00:21:37.577 }, 00:21:37.577 { 00:21:37.577 "name": null, 00:21:37.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.577 "is_configured": false, 00:21:37.577 "data_offset": 2048, 00:21:37.577 "data_size": 63488 00:21:37.577 }, 00:21:37.577 { 00:21:37.577 "name": "BaseBdev3", 00:21:37.577 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:37.577 "is_configured": true, 00:21:37.577 "data_offset": 2048, 00:21:37.577 "data_size": 63488 00:21:37.577 }, 00:21:37.577 { 00:21:37.577 "name": "BaseBdev4", 00:21:37.577 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 
00:21:37.577 "is_configured": true, 00:21:37.577 "data_offset": 2048, 00:21:37.577 "data_size": 63488 00:21:37.577 } 00:21:37.577 ] 00:21:37.577 }' 00:21:37.577 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.577 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:38.138 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:38.138 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:38.139 "name": "raid_bdev1", 00:21:38.139 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:38.139 "strip_size_kb": 0, 00:21:38.139 "state": "online", 00:21:38.139 "raid_level": "raid1", 00:21:38.139 "superblock": true, 00:21:38.139 "num_base_bdevs": 4, 00:21:38.139 "num_base_bdevs_discovered": 2, 00:21:38.139 "num_base_bdevs_operational": 2, 00:21:38.139 "base_bdevs_list": [ 00:21:38.139 { 00:21:38.139 "name": null, 00:21:38.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.139 "is_configured": false, 00:21:38.139 "data_offset": 2048, 00:21:38.139 "data_size": 63488 00:21:38.139 }, 00:21:38.139 { 00:21:38.139 "name": null, 00:21:38.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.139 "is_configured": false, 00:21:38.139 "data_offset": 2048, 00:21:38.139 "data_size": 63488 00:21:38.139 }, 00:21:38.139 { 00:21:38.139 "name": "BaseBdev3", 00:21:38.139 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:38.139 "is_configured": true, 00:21:38.139 "data_offset": 2048, 00:21:38.139 "data_size": 63488 00:21:38.139 }, 00:21:38.139 { 00:21:38.139 "name": "BaseBdev4", 00:21:38.139 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:38.139 "is_configured": true, 00:21:38.139 "data_offset": 2048, 00:21:38.139 "data_size": 63488 00:21:38.139 } 00:21:38.139 ] 00:21:38.139 }' 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:38.139 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:38.396 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:38.396 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:38.396 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc 
-p BaseBdev1 00:21:38.652 [2024-07-15 13:44:26.107079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:38.652 [2024-07-15 13:44:26.107120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.652 [2024-07-15 13:44:26.107137] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b2870 00:21:38.652 [2024-07-15 13:44:26.107146] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.652 [2024-07-15 13:44:26.107424] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.652 [2024-07-15 13:44:26.107443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:38.652 [2024-07-15 13:44:26.107495] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:38.652 [2024-07-15 13:44:26.107504] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:38.652 [2024-07-15 13:44:26.107512] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:38.652 BaseBdev1 00:21:38.652 13:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.582 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.839 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.839 "name": "raid_bdev1", 00:21:39.839 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:39.839 "strip_size_kb": 0, 00:21:39.839 "state": "online", 00:21:39.839 "raid_level": "raid1", 00:21:39.839 "superblock": true, 00:21:39.839 "num_base_bdevs": 4, 00:21:39.839 "num_base_bdevs_discovered": 2, 00:21:39.839 "num_base_bdevs_operational": 2, 00:21:39.839 "base_bdevs_list": [ 00:21:39.839 { 00:21:39.839 "name": null, 00:21:39.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.839 "is_configured": false, 00:21:39.839 "data_offset": 2048, 00:21:39.839 "data_size": 63488 00:21:39.839 }, 00:21:39.839 { 00:21:39.839 "name": null, 00:21:39.839 
"uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.839 "is_configured": false, 00:21:39.839 "data_offset": 2048, 00:21:39.839 "data_size": 63488 00:21:39.839 }, 00:21:39.839 { 00:21:39.839 "name": "BaseBdev3", 00:21:39.839 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:39.839 "is_configured": true, 00:21:39.839 "data_offset": 2048, 00:21:39.839 "data_size": 63488 00:21:39.839 }, 00:21:39.839 { 00:21:39.839 "name": "BaseBdev4", 00:21:39.839 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:39.839 "is_configured": true, 00:21:39.840 "data_offset": 2048, 00:21:39.840 "data_size": 63488 00:21:39.840 } 00:21:39.840 ] 00:21:39.840 }' 00:21:39.840 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.840 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:40.404 "name": "raid_bdev1", 00:21:40.404 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:40.404 "strip_size_kb": 0, 00:21:40.404 "state": "online", 00:21:40.404 "raid_level": "raid1", 00:21:40.404 "superblock": true, 00:21:40.404 "num_base_bdevs": 4, 00:21:40.404 "num_base_bdevs_discovered": 2, 00:21:40.404 "num_base_bdevs_operational": 2, 00:21:40.404 "base_bdevs_list": [ 00:21:40.404 { 00:21:40.404 "name": null, 00:21:40.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.404 "is_configured": false, 00:21:40.404 "data_offset": 2048, 00:21:40.404 "data_size": 63488 00:21:40.404 }, 00:21:40.404 { 00:21:40.404 "name": null, 00:21:40.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.404 "is_configured": false, 00:21:40.404 "data_offset": 2048, 00:21:40.404 "data_size": 63488 00:21:40.404 }, 00:21:40.404 { 00:21:40.404 "name": "BaseBdev3", 00:21:40.404 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:40.404 "is_configured": true, 00:21:40.404 "data_offset": 2048, 00:21:40.404 "data_size": 63488 00:21:40.404 }, 00:21:40.404 { 00:21:40.404 "name": "BaseBdev4", 00:21:40.404 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:40.404 "is_configured": true, 00:21:40.404 "data_offset": 2048, 00:21:40.404 "data_size": 63488 00:21:40.404 } 00:21:40.404 ] 00:21:40.404 }' 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:40.404 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:40.661 13:44:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:40.661 [2024-07-15 13:44:28.208675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:40.661 [2024-07-15 13:44:28.208794] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:40.661 [2024-07-15 13:44:28.208807] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:40.661 request: 00:21:40.661 { 00:21:40.661 "base_bdev": "BaseBdev1", 00:21:40.661 "raid_bdev": "raid_bdev1", 00:21:40.661 "method": "bdev_raid_add_base_bdev", 00:21:40.661 "req_id": 1 00:21:40.661 } 00:21:40.661 Got JSON-RPC error response 00:21:40.661 response: 00:21:40.661 { 00:21:40.661 "code": -22, 00:21:40.661 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:40.661 } 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:40.661 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.032 "name": "raid_bdev1", 00:21:42.032 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:42.032 "strip_size_kb": 0, 00:21:42.032 "state": "online", 00:21:42.032 "raid_level": "raid1", 00:21:42.032 "superblock": true, 00:21:42.032 "num_base_bdevs": 4, 00:21:42.032 "num_base_bdevs_discovered": 2, 00:21:42.032 "num_base_bdevs_operational": 2, 00:21:42.032 "base_bdevs_list": [ 00:21:42.032 { 00:21:42.032 "name": null, 00:21:42.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.032 "is_configured": false, 00:21:42.032 "data_offset": 2048, 00:21:42.032 "data_size": 63488 00:21:42.032 }, 00:21:42.032 { 00:21:42.032 "name": null, 00:21:42.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.032 "is_configured": false, 00:21:42.032 "data_offset": 2048, 00:21:42.032 "data_size": 63488 00:21:42.032 }, 00:21:42.032 { 00:21:42.032 "name": "BaseBdev3", 00:21:42.032 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:42.032 "is_configured": true, 00:21:42.032 "data_offset": 2048, 00:21:42.032 "data_size": 63488 00:21:42.032 }, 00:21:42.032 { 00:21:42.032 "name": "BaseBdev4", 00:21:42.032 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:42.032 "is_configured": true, 00:21:42.032 "data_offset": 2048, 00:21:42.032 "data_size": 63488 00:21:42.032 } 00:21:42.032 ] 00:21:42.032 }' 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.032 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.289 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:42.546 "name": "raid_bdev1", 00:21:42.546 "uuid": "e1f2d40b-db4a-477c-951b-9bb4262b3cd2", 00:21:42.546 "strip_size_kb": 0, 00:21:42.546 "state": "online", 00:21:42.546 "raid_level": "raid1", 00:21:42.546 "superblock": true, 00:21:42.546 "num_base_bdevs": 4, 00:21:42.546 "num_base_bdevs_discovered": 2, 00:21:42.546 "num_base_bdevs_operational": 2, 00:21:42.546 "base_bdevs_list": [ 00:21:42.546 { 00:21:42.546 "name": null, 00:21:42.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.546 "is_configured": false, 00:21:42.546 "data_offset": 2048, 00:21:42.546 "data_size": 63488 00:21:42.546 }, 00:21:42.546 { 00:21:42.546 "name": null, 00:21:42.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.546 "is_configured": false, 00:21:42.546 "data_offset": 2048, 00:21:42.546 "data_size": 63488 00:21:42.546 }, 00:21:42.546 { 00:21:42.546 "name": "BaseBdev3", 00:21:42.546 "uuid": "abb9bfaf-fea9-56f3-854a-6f1d9e11e4bb", 00:21:42.546 "is_configured": true, 00:21:42.546 "data_offset": 2048, 00:21:42.546 "data_size": 63488 00:21:42.546 }, 00:21:42.546 { 00:21:42.546 "name": "BaseBdev4", 00:21:42.546 "uuid": "0adf6587-d794-583d-86db-5c957a5dfe18", 00:21:42.546 "is_configured": true, 00:21:42.546 "data_offset": 2048, 00:21:42.546 "data_size": 63488 00:21:42.546 } 00:21:42.546 ] 00:21:42.546 }' 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 81146 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 81146 ']' 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 81146 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:42.546 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 81146 00:21:42.803 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:42.803 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:42.803 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 81146' 00:21:42.803 killing process with pid 81146 00:21:42.803 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 81146 00:21:42.803 Received shutdown signal, test time was about 22.850002 seconds 00:21:42.803 00:21:42.803 Latency(us) 00:21:42.803 Device 
Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:42.803 =================================================================================================================== 00:21:42.803 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:42.803 [2024-07-15 13:44:30.191159] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:42.803 [2024-07-15 13:44:30.191246] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:42.803 [2024-07-15 13:44:30.191294] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:42.803 [2024-07-15 13:44:30.191304] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x262b1f0 name raid_bdev1, state offline 00:21:42.803 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 81146 00:21:42.803 [2024-07-15 13:44:30.234464] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:43.061 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:43.061 00:21:43.061 real 0m27.267s 00:21:43.061 user 0m41.680s 00:21:43.061 sys 0m4.327s 00:21:43.061 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:43.061 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:43.061 ************************************ 00:21:43.061 END TEST raid_rebuild_test_sb_io 00:21:43.061 ************************************ 00:21:43.061 13:44:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:43.061 13:44:30 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:21:43.061 13:44:30 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:21:43.061 13:44:30 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:21:43.061 13:44:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:43.061 13:44:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:43.061 13:44:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:43.061 ************************************ 00:21:43.061 START TEST raid_state_function_test_sb_4k 00:21:43.061 ************************************ 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo 
BaseBdev2 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:43.061 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=85163 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 85163' 00:21:43.062 Process raid pid: 85163 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 85163 /var/tmp/spdk-raid.sock 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 85163 ']' 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:43.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:43.062 13:44:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:43.062 [2024-07-15 13:44:30.577230] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:21:43.062 [2024-07-15 13:44:30.577283] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:43.062 [2024-07-15 13:44:30.664205] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:43.319 [2024-07-15 13:44:30.756549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:43.319 [2024-07-15 13:44:30.818022] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.319 [2024-07-15 13:44:30.818048] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.880 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:43.880 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:21:43.880 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:44.137 [2024-07-15 13:44:31.535579] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:44.137 [2024-07-15 13:44:31.535613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:44.137 [2024-07-15 13:44:31.535621] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:44.137 [2024-07-15 13:44:31.535629] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.137 "name": "Existed_Raid", 00:21:44.137 "uuid": "15094657-d4be-43bd-98b9-12d30a32bfa4", 00:21:44.137 "strip_size_kb": 0, 00:21:44.137 "state": 
"configuring", 00:21:44.137 "raid_level": "raid1", 00:21:44.137 "superblock": true, 00:21:44.137 "num_base_bdevs": 2, 00:21:44.137 "num_base_bdevs_discovered": 0, 00:21:44.137 "num_base_bdevs_operational": 2, 00:21:44.137 "base_bdevs_list": [ 00:21:44.137 { 00:21:44.137 "name": "BaseBdev1", 00:21:44.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.137 "is_configured": false, 00:21:44.137 "data_offset": 0, 00:21:44.137 "data_size": 0 00:21:44.137 }, 00:21:44.137 { 00:21:44.137 "name": "BaseBdev2", 00:21:44.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.137 "is_configured": false, 00:21:44.137 "data_offset": 0, 00:21:44.137 "data_size": 0 00:21:44.137 } 00:21:44.137 ] 00:21:44.137 }' 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.137 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:44.702 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:44.959 [2024-07-15 13:44:32.389697] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:44.959 [2024-07-15 13:44:32.389719] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26e4f30 name Existed_Raid, state configuring 00:21:44.959 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:44.959 [2024-07-15 13:44:32.570184] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:44.959 [2024-07-15 13:44:32.570210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:44.959 [2024-07-15 13:44:32.570217] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:44.959 [2024-07-15 13:44:32.570225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:21:45.217 [2024-07-15 13:44:32.743410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:45.217 BaseBdev1 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:45.217 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:45.475 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:45.475 [ 00:21:45.475 { 00:21:45.475 "name": "BaseBdev1", 00:21:45.475 "aliases": [ 00:21:45.475 "c315f40a-43ae-41ca-819b-933da7c62855" 00:21:45.475 ], 00:21:45.475 "product_name": "Malloc disk", 00:21:45.475 "block_size": 4096, 00:21:45.475 "num_blocks": 8192, 00:21:45.475 "uuid": "c315f40a-43ae-41ca-819b-933da7c62855", 00:21:45.475 "assigned_rate_limits": { 00:21:45.475 "rw_ios_per_sec": 0, 00:21:45.475 "rw_mbytes_per_sec": 0, 00:21:45.475 "r_mbytes_per_sec": 0, 00:21:45.475 "w_mbytes_per_sec": 0 00:21:45.475 }, 00:21:45.475 "claimed": true, 00:21:45.475 "claim_type": "exclusive_write", 00:21:45.475 "zoned": false, 00:21:45.475 "supported_io_types": { 00:21:45.475 "read": true, 00:21:45.475 "write": true, 00:21:45.475 "unmap": true, 00:21:45.475 "flush": true, 00:21:45.475 "reset": true, 00:21:45.475 "nvme_admin": false, 00:21:45.475 "nvme_io": false, 00:21:45.475 "nvme_io_md": false, 00:21:45.475 "write_zeroes": true, 00:21:45.475 "zcopy": true, 00:21:45.475 "get_zone_info": false, 00:21:45.475 "zone_management": false, 00:21:45.475 "zone_append": false, 00:21:45.475 "compare": false, 00:21:45.475 "compare_and_write": false, 00:21:45.475 "abort": true, 00:21:45.475 "seek_hole": false, 00:21:45.475 "seek_data": false, 00:21:45.475 "copy": true, 00:21:45.475 "nvme_iov_md": false 00:21:45.475 }, 00:21:45.475 "memory_domains": [ 00:21:45.475 { 00:21:45.475 "dma_device_id": "system", 00:21:45.475 "dma_device_type": 1 00:21:45.475 }, 00:21:45.475 { 00:21:45.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.475 "dma_device_type": 2 00:21:45.475 } 00:21:45.475 ], 00:21:45.475 "driver_specific": {} 00:21:45.475 } 00:21:45.475 ] 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.475 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.732 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.732 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.732 13:44:33 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.732 "name": "Existed_Raid", 00:21:45.732 "uuid": "88162379-1630-4476-a1ff-aa64302cb8ce", 00:21:45.732 "strip_size_kb": 0, 00:21:45.732 "state": "configuring", 00:21:45.732 "raid_level": "raid1", 00:21:45.732 "superblock": true, 00:21:45.732 "num_base_bdevs": 2, 00:21:45.732 "num_base_bdevs_discovered": 1, 00:21:45.732 "num_base_bdevs_operational": 2, 00:21:45.732 "base_bdevs_list": [ 00:21:45.732 { 00:21:45.732 "name": "BaseBdev1", 00:21:45.732 "uuid": "c315f40a-43ae-41ca-819b-933da7c62855", 00:21:45.732 "is_configured": true, 00:21:45.732 "data_offset": 256, 00:21:45.732 "data_size": 7936 00:21:45.732 }, 00:21:45.732 { 00:21:45.732 "name": "BaseBdev2", 00:21:45.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.732 "is_configured": false, 00:21:45.732 "data_offset": 0, 00:21:45.732 "data_size": 0 00:21:45.732 } 00:21:45.732 ] 00:21:45.732 }' 00:21:45.732 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.732 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:46.296 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:46.296 [2024-07-15 13:44:33.914503] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:46.296 [2024-07-15 13:44:33.914534] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26e4820 name Existed_Raid, state configuring 00:21:46.554 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:46.554 [2024-07-15 13:44:34.086967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:46.554 [2024-07-15 13:44:34.088018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:46.554 [2024-07-15 13:44:34.088042] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.554 13:44:34 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.554 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:46.812 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.812 "name": "Existed_Raid", 00:21:46.812 "uuid": "f09afc62-2a8f-4fd3-bcfa-c9ddaa2c186f", 00:21:46.812 "strip_size_kb": 0, 00:21:46.812 "state": "configuring", 00:21:46.812 "raid_level": "raid1", 00:21:46.812 "superblock": true, 00:21:46.812 "num_base_bdevs": 2, 00:21:46.812 "num_base_bdevs_discovered": 1, 00:21:46.812 "num_base_bdevs_operational": 2, 00:21:46.812 "base_bdevs_list": [ 00:21:46.812 { 00:21:46.812 "name": "BaseBdev1", 00:21:46.812 "uuid": "c315f40a-43ae-41ca-819b-933da7c62855", 00:21:46.812 "is_configured": true, 00:21:46.812 "data_offset": 256, 00:21:46.812 "data_size": 7936 00:21:46.812 }, 00:21:46.812 { 00:21:46.812 "name": "BaseBdev2", 00:21:46.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.812 "is_configured": false, 00:21:46.812 "data_offset": 0, 00:21:46.812 "data_size": 0 00:21:46.812 } 00:21:46.812 ] 00:21:46.812 }' 00:21:46.812 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.812 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:21:47.377 [2024-07-15 13:44:34.932045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:47.377 [2024-07-15 13:44:34.932168] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26e5610 00:21:47.377 [2024-07-15 13:44:34.932178] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:47.377 [2024-07-15 13:44:34.932305] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2899150 00:21:47.377 [2024-07-15 13:44:34.932391] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26e5610 00:21:47.377 [2024-07-15 13:44:34.932398] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26e5610 00:21:47.377 [2024-07-15 13:44:34.932461] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:47.377 BaseBdev2 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:47.377 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:47.634 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:47.892 [ 00:21:47.892 { 00:21:47.892 "name": "BaseBdev2", 00:21:47.892 "aliases": [ 00:21:47.892 "08e12faf-36a6-483a-a449-ecf569a4f3bf" 00:21:47.892 ], 00:21:47.892 "product_name": "Malloc disk", 00:21:47.892 "block_size": 4096, 00:21:47.892 "num_blocks": 8192, 00:21:47.892 "uuid": "08e12faf-36a6-483a-a449-ecf569a4f3bf", 00:21:47.892 "assigned_rate_limits": { 00:21:47.892 "rw_ios_per_sec": 0, 00:21:47.892 "rw_mbytes_per_sec": 0, 00:21:47.892 "r_mbytes_per_sec": 0, 00:21:47.892 "w_mbytes_per_sec": 0 00:21:47.892 }, 00:21:47.892 "claimed": true, 00:21:47.892 "claim_type": "exclusive_write", 00:21:47.892 "zoned": false, 00:21:47.892 "supported_io_types": { 00:21:47.892 "read": true, 00:21:47.892 "write": true, 00:21:47.892 "unmap": true, 00:21:47.892 "flush": true, 00:21:47.892 "reset": true, 00:21:47.892 "nvme_admin": false, 00:21:47.892 "nvme_io": false, 00:21:47.892 "nvme_io_md": false, 00:21:47.892 "write_zeroes": true, 00:21:47.892 "zcopy": true, 00:21:47.892 "get_zone_info": false, 00:21:47.892 "zone_management": false, 00:21:47.892 "zone_append": false, 00:21:47.892 "compare": false, 00:21:47.892 "compare_and_write": false, 00:21:47.892 "abort": true, 00:21:47.892 "seek_hole": false, 00:21:47.892 "seek_data": false, 00:21:47.892 "copy": true, 00:21:47.892 "nvme_iov_md": false 00:21:47.892 }, 00:21:47.892 "memory_domains": [ 00:21:47.892 { 00:21:47.892 "dma_device_id": "system", 00:21:47.892 "dma_device_type": 1 00:21:47.892 }, 00:21:47.892 { 00:21:47.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.892 "dma_device_type": 2 00:21:47.892 } 00:21:47.892 ], 00:21:47.892 "driver_specific": {} 00:21:47.892 } 00:21:47.892 ] 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.892 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.892 "name": "Existed_Raid", 00:21:47.892 "uuid": "f09afc62-2a8f-4fd3-bcfa-c9ddaa2c186f", 00:21:47.892 "strip_size_kb": 0, 00:21:47.892 "state": "online", 00:21:47.892 "raid_level": "raid1", 00:21:47.892 "superblock": true, 00:21:47.892 "num_base_bdevs": 2, 00:21:47.892 "num_base_bdevs_discovered": 2, 00:21:47.892 "num_base_bdevs_operational": 2, 00:21:47.892 "base_bdevs_list": [ 00:21:47.892 { 00:21:47.892 "name": "BaseBdev1", 00:21:47.892 "uuid": "c315f40a-43ae-41ca-819b-933da7c62855", 00:21:47.892 "is_configured": true, 00:21:47.893 "data_offset": 256, 00:21:47.893 "data_size": 7936 00:21:47.893 }, 00:21:47.893 { 00:21:47.893 "name": "BaseBdev2", 00:21:47.893 "uuid": "08e12faf-36a6-483a-a449-ecf569a4f3bf", 00:21:47.893 "is_configured": true, 00:21:47.893 "data_offset": 256, 00:21:47.893 "data_size": 7936 00:21:47.893 } 00:21:47.893 ] 00:21:47.893 }' 00:21:47.893 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.893 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:48.457 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:48.457 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:48.457 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:48.457 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:48.457 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:48.457 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:48.457 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:48.457 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:48.714 [2024-07-15 13:44:36.155381] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:48.714 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:48.714 "name": "Existed_Raid", 00:21:48.714 "aliases": [ 00:21:48.714 "f09afc62-2a8f-4fd3-bcfa-c9ddaa2c186f" 00:21:48.714 ], 00:21:48.714 "product_name": "Raid Volume", 00:21:48.714 "block_size": 4096, 00:21:48.714 "num_blocks": 7936, 00:21:48.714 "uuid": "f09afc62-2a8f-4fd3-bcfa-c9ddaa2c186f", 00:21:48.714 "assigned_rate_limits": { 00:21:48.714 "rw_ios_per_sec": 0, 00:21:48.714 "rw_mbytes_per_sec": 0, 00:21:48.714 "r_mbytes_per_sec": 0, 00:21:48.714 "w_mbytes_per_sec": 0 00:21:48.714 }, 00:21:48.714 "claimed": false, 00:21:48.714 "zoned": false, 00:21:48.714 "supported_io_types": { 00:21:48.714 "read": 
true, 00:21:48.714 "write": true, 00:21:48.714 "unmap": false, 00:21:48.714 "flush": false, 00:21:48.714 "reset": true, 00:21:48.714 "nvme_admin": false, 00:21:48.714 "nvme_io": false, 00:21:48.714 "nvme_io_md": false, 00:21:48.714 "write_zeroes": true, 00:21:48.714 "zcopy": false, 00:21:48.714 "get_zone_info": false, 00:21:48.714 "zone_management": false, 00:21:48.714 "zone_append": false, 00:21:48.714 "compare": false, 00:21:48.714 "compare_and_write": false, 00:21:48.714 "abort": false, 00:21:48.714 "seek_hole": false, 00:21:48.714 "seek_data": false, 00:21:48.714 "copy": false, 00:21:48.714 "nvme_iov_md": false 00:21:48.714 }, 00:21:48.714 "memory_domains": [ 00:21:48.714 { 00:21:48.714 "dma_device_id": "system", 00:21:48.714 "dma_device_type": 1 00:21:48.714 }, 00:21:48.714 { 00:21:48.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.714 "dma_device_type": 2 00:21:48.714 }, 00:21:48.714 { 00:21:48.714 "dma_device_id": "system", 00:21:48.714 "dma_device_type": 1 00:21:48.714 }, 00:21:48.714 { 00:21:48.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.714 "dma_device_type": 2 00:21:48.714 } 00:21:48.714 ], 00:21:48.714 "driver_specific": { 00:21:48.714 "raid": { 00:21:48.714 "uuid": "f09afc62-2a8f-4fd3-bcfa-c9ddaa2c186f", 00:21:48.714 "strip_size_kb": 0, 00:21:48.714 "state": "online", 00:21:48.714 "raid_level": "raid1", 00:21:48.714 "superblock": true, 00:21:48.714 "num_base_bdevs": 2, 00:21:48.714 "num_base_bdevs_discovered": 2, 00:21:48.714 "num_base_bdevs_operational": 2, 00:21:48.714 "base_bdevs_list": [ 00:21:48.714 { 00:21:48.714 "name": "BaseBdev1", 00:21:48.714 "uuid": "c315f40a-43ae-41ca-819b-933da7c62855", 00:21:48.714 "is_configured": true, 00:21:48.714 "data_offset": 256, 00:21:48.714 "data_size": 7936 00:21:48.714 }, 00:21:48.714 { 00:21:48.714 "name": "BaseBdev2", 00:21:48.714 "uuid": "08e12faf-36a6-483a-a449-ecf569a4f3bf", 00:21:48.714 "is_configured": true, 00:21:48.714 "data_offset": 256, 00:21:48.714 "data_size": 7936 00:21:48.714 } 00:21:48.714 ] 00:21:48.714 } 00:21:48.714 } 00:21:48.714 }' 00:21:48.714 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:48.714 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:48.714 BaseBdev2' 00:21:48.714 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:48.714 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:48.714 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:48.971 "name": "BaseBdev1", 00:21:48.971 "aliases": [ 00:21:48.971 "c315f40a-43ae-41ca-819b-933da7c62855" 00:21:48.971 ], 00:21:48.971 "product_name": "Malloc disk", 00:21:48.971 "block_size": 4096, 00:21:48.971 "num_blocks": 8192, 00:21:48.971 "uuid": "c315f40a-43ae-41ca-819b-933da7c62855", 00:21:48.971 "assigned_rate_limits": { 00:21:48.971 "rw_ios_per_sec": 0, 00:21:48.971 "rw_mbytes_per_sec": 0, 00:21:48.971 "r_mbytes_per_sec": 0, 00:21:48.971 "w_mbytes_per_sec": 0 00:21:48.971 }, 00:21:48.971 "claimed": true, 00:21:48.971 "claim_type": "exclusive_write", 00:21:48.971 "zoned": false, 00:21:48.971 
"supported_io_types": { 00:21:48.971 "read": true, 00:21:48.971 "write": true, 00:21:48.971 "unmap": true, 00:21:48.971 "flush": true, 00:21:48.971 "reset": true, 00:21:48.971 "nvme_admin": false, 00:21:48.971 "nvme_io": false, 00:21:48.971 "nvme_io_md": false, 00:21:48.971 "write_zeroes": true, 00:21:48.971 "zcopy": true, 00:21:48.971 "get_zone_info": false, 00:21:48.971 "zone_management": false, 00:21:48.971 "zone_append": false, 00:21:48.971 "compare": false, 00:21:48.971 "compare_and_write": false, 00:21:48.971 "abort": true, 00:21:48.971 "seek_hole": false, 00:21:48.971 "seek_data": false, 00:21:48.971 "copy": true, 00:21:48.971 "nvme_iov_md": false 00:21:48.971 }, 00:21:48.971 "memory_domains": [ 00:21:48.971 { 00:21:48.971 "dma_device_id": "system", 00:21:48.971 "dma_device_type": 1 00:21:48.971 }, 00:21:48.971 { 00:21:48.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.971 "dma_device_type": 2 00:21:48.971 } 00:21:48.971 ], 00:21:48.971 "driver_specific": {} 00:21:48.971 }' 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:48.971 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:49.228 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:49.485 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:49.485 "name": "BaseBdev2", 00:21:49.485 "aliases": [ 00:21:49.485 "08e12faf-36a6-483a-a449-ecf569a4f3bf" 00:21:49.485 ], 00:21:49.485 "product_name": "Malloc disk", 00:21:49.485 "block_size": 4096, 00:21:49.485 "num_blocks": 8192, 00:21:49.485 "uuid": "08e12faf-36a6-483a-a449-ecf569a4f3bf", 00:21:49.485 "assigned_rate_limits": { 00:21:49.485 "rw_ios_per_sec": 0, 00:21:49.485 "rw_mbytes_per_sec": 0, 00:21:49.485 "r_mbytes_per_sec": 0, 00:21:49.485 "w_mbytes_per_sec": 0 00:21:49.485 }, 00:21:49.485 "claimed": true, 00:21:49.486 "claim_type": "exclusive_write", 00:21:49.486 "zoned": false, 00:21:49.486 "supported_io_types": { 00:21:49.486 "read": true, 00:21:49.486 "write": true, 
00:21:49.486 "unmap": true, 00:21:49.486 "flush": true, 00:21:49.486 "reset": true, 00:21:49.486 "nvme_admin": false, 00:21:49.486 "nvme_io": false, 00:21:49.486 "nvme_io_md": false, 00:21:49.486 "write_zeroes": true, 00:21:49.486 "zcopy": true, 00:21:49.486 "get_zone_info": false, 00:21:49.486 "zone_management": false, 00:21:49.486 "zone_append": false, 00:21:49.486 "compare": false, 00:21:49.486 "compare_and_write": false, 00:21:49.486 "abort": true, 00:21:49.486 "seek_hole": false, 00:21:49.486 "seek_data": false, 00:21:49.486 "copy": true, 00:21:49.486 "nvme_iov_md": false 00:21:49.486 }, 00:21:49.486 "memory_domains": [ 00:21:49.486 { 00:21:49.486 "dma_device_id": "system", 00:21:49.486 "dma_device_type": 1 00:21:49.486 }, 00:21:49.486 { 00:21:49.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.486 "dma_device_type": 2 00:21:49.486 } 00:21:49.486 ], 00:21:49.486 "driver_specific": {} 00:21:49.486 }' 00:21:49.486 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:49.486 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:49.486 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:49.486 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.486 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.486 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:49.486 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.486 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.743 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:49.743 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.743 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.743 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:49.743 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:50.000 [2024-07-15 13:44:37.374369] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.000 13:44:37 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.000 "name": "Existed_Raid", 00:21:50.000 "uuid": "f09afc62-2a8f-4fd3-bcfa-c9ddaa2c186f", 00:21:50.000 "strip_size_kb": 0, 00:21:50.000 "state": "online", 00:21:50.000 "raid_level": "raid1", 00:21:50.000 "superblock": true, 00:21:50.000 "num_base_bdevs": 2, 00:21:50.000 "num_base_bdevs_discovered": 1, 00:21:50.000 "num_base_bdevs_operational": 1, 00:21:50.000 "base_bdevs_list": [ 00:21:50.000 { 00:21:50.000 "name": null, 00:21:50.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.000 "is_configured": false, 00:21:50.000 "data_offset": 256, 00:21:50.000 "data_size": 7936 00:21:50.000 }, 00:21:50.000 { 00:21:50.000 "name": "BaseBdev2", 00:21:50.000 "uuid": "08e12faf-36a6-483a-a449-ecf569a4f3bf", 00:21:50.000 "is_configured": true, 00:21:50.000 "data_offset": 256, 00:21:50.000 "data_size": 7936 00:21:50.000 } 00:21:50.000 ] 00:21:50.000 }' 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.000 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:50.564 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:50.564 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:50.564 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:50.564 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.822 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:50.822 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:50.822 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:50.822 [2024-07-15 13:44:38.413867] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:50.822 [2024-07-15 13:44:38.413934] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state 
changing from online to offline 00:21:50.822 [2024-07-15 13:44:38.424247] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:50.822 [2024-07-15 13:44:38.424273] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:50.822 [2024-07-15 13:44:38.424281] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26e5610 name Existed_Raid, state offline 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 85163 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 85163 ']' 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 85163 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 85163 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 85163' 00:21:51.078 killing process with pid 85163 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 85163 00:21:51.078 [2024-07-15 13:44:38.674218] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:51.078 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 85163 00:21:51.078 [2024-07-15 13:44:38.675059] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:51.334 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:21:51.334 00:21:51.334 real 0m8.345s 00:21:51.334 user 0m14.631s 00:21:51.334 sys 0m1.648s 00:21:51.334 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:51.334 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:51.334 ************************************ 00:21:51.334 END TEST raid_state_function_test_sb_4k 00:21:51.334 ************************************ 00:21:51.334 13:44:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 
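(Editorial aside: the raid_state_function_test_sb_4k run above repeatedly verifies Existed_Raid by dumping it over the test RPC socket and comparing selected fields with jq. Below is a minimal standalone sketch of that query pattern; it uses only the rpc.py invocation, the bdev_raid_get_bdevs call, and the jq filters already visible in this log. The helper name check_raid_state is illustrative and is not part of bdev_raid.sh.)

  #!/usr/bin/env bash
  # Sketch of the state-check pattern seen in the log above. Assumes an SPDK
  # app is already listening on /var/tmp/spdk-raid.sock; the rpc.py path is
  # the one used throughout this run.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # check_raid_state <raid_bdev_name> <expected_state> <expected_operational>
  # Illustrative helper, not taken from bdev_raid.sh.
  check_raid_state() {
      local name=$1 expected_state=$2 expected_operational=$3 info
      # Dump every raid bdev and keep only the one under test.
      info=$($RPC bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
      [ "$(echo "$info" | jq -r '.state')" = "$expected_state" ] || return 1
      [ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq "$expected_operational" ] || return 1
  }

  # Example: after BaseBdev1 is removed, the log above expects the array to
  # stay online with a single operational base bdev.
  check_raid_state Existed_Raid online 1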
00:21:51.334 13:44:38 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:21:51.334 13:44:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:51.334 13:44:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:51.334 13:44:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:51.591 ************************************ 00:21:51.591 START TEST raid_superblock_test_4k 00:21:51.591 ************************************ 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=86491 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 86491 /var/tmp/spdk-raid.sock 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 86491 ']' 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:51.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:51.591 13:44:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:51.591 [2024-07-15 13:44:39.018785] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:21:51.591 [2024-07-15 13:44:39.018843] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86491 ] 00:21:51.591 [2024-07-15 13:44:39.110536] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:51.591 [2024-07-15 13:44:39.192915] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:51.848 [2024-07-15 13:44:39.246263] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:51.848 [2024-07-15 13:44:39.246293] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:52.411 13:44:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:21:52.411 malloc1 00:21:52.411 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:52.667 [2024-07-15 13:44:40.153721] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:52.667 [2024-07-15 13:44:40.153771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.667 [2024-07-15 13:44:40.153788] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2416260 00:21:52.667 [2024-07-15 13:44:40.153796] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.667 [2024-07-15 13:44:40.155048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.667 [2024-07-15 13:44:40.155075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:52.667 pt1 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:52.667 13:44:40 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:52.667 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:21:52.923 malloc2 00:21:52.923 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:52.923 [2024-07-15 13:44:40.510729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:52.923 [2024-07-15 13:44:40.510771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.923 [2024-07-15 13:44:40.510785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c0310 00:21:52.923 [2024-07-15 13:44:40.510794] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.923 [2024-07-15 13:44:40.511873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.923 [2024-07-15 13:44:40.511900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:52.923 pt2 00:21:52.923 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:52.923 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:52.923 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:53.215 [2024-07-15 13:44:40.683181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:53.215 [2024-07-15 13:44:40.684022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:53.215 [2024-07-15 13:44:40.684128] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25bf5b0 00:21:53.215 [2024-07-15 13:44:40.684137] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:53.215 [2024-07-15 13:44:40.684267] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c1a10 00:21:53.215 [2024-07-15 13:44:40.684366] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25bf5b0 00:21:53.215 [2024-07-15 13:44:40.684372] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25bf5b0 00:21:53.215 [2024-07-15 13:44:40.684445] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.215 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:53.498 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.498 "name": "raid_bdev1", 00:21:53.498 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:53.498 "strip_size_kb": 0, 00:21:53.498 "state": "online", 00:21:53.498 "raid_level": "raid1", 00:21:53.498 "superblock": true, 00:21:53.498 "num_base_bdevs": 2, 00:21:53.498 "num_base_bdevs_discovered": 2, 00:21:53.498 "num_base_bdevs_operational": 2, 00:21:53.498 "base_bdevs_list": [ 00:21:53.498 { 00:21:53.498 "name": "pt1", 00:21:53.498 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:53.498 "is_configured": true, 00:21:53.498 "data_offset": 256, 00:21:53.498 "data_size": 7936 00:21:53.498 }, 00:21:53.498 { 00:21:53.498 "name": "pt2", 00:21:53.498 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:53.498 "is_configured": true, 00:21:53.498 "data_offset": 256, 00:21:53.498 "data_size": 7936 00:21:53.498 } 00:21:53.498 ] 00:21:53.498 }' 00:21:53.498 13:44:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.498 13:44:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:54.062 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:54.062 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:54.063 [2024-07-15 13:44:41.533525] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:54.063 "name": "raid_bdev1", 00:21:54.063 "aliases": [ 00:21:54.063 "d72e3814-0cba-4520-9ea9-de597638fef4" 00:21:54.063 ], 00:21:54.063 "product_name": "Raid Volume", 00:21:54.063 "block_size": 4096, 00:21:54.063 "num_blocks": 7936, 00:21:54.063 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:54.063 "assigned_rate_limits": { 00:21:54.063 "rw_ios_per_sec": 0, 00:21:54.063 "rw_mbytes_per_sec": 0, 00:21:54.063 "r_mbytes_per_sec": 0, 00:21:54.063 "w_mbytes_per_sec": 0 00:21:54.063 }, 00:21:54.063 "claimed": false, 00:21:54.063 "zoned": false, 00:21:54.063 "supported_io_types": { 00:21:54.063 "read": true, 00:21:54.063 "write": true, 00:21:54.063 "unmap": false, 00:21:54.063 "flush": false, 00:21:54.063 "reset": true, 00:21:54.063 "nvme_admin": false, 00:21:54.063 "nvme_io": false, 00:21:54.063 "nvme_io_md": false, 00:21:54.063 "write_zeroes": true, 00:21:54.063 "zcopy": false, 00:21:54.063 "get_zone_info": false, 00:21:54.063 "zone_management": false, 00:21:54.063 "zone_append": false, 00:21:54.063 "compare": false, 00:21:54.063 "compare_and_write": false, 00:21:54.063 "abort": false, 00:21:54.063 "seek_hole": false, 00:21:54.063 "seek_data": false, 00:21:54.063 "copy": false, 00:21:54.063 "nvme_iov_md": false 00:21:54.063 }, 00:21:54.063 "memory_domains": [ 00:21:54.063 { 00:21:54.063 "dma_device_id": "system", 00:21:54.063 "dma_device_type": 1 00:21:54.063 }, 00:21:54.063 { 00:21:54.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.063 "dma_device_type": 2 00:21:54.063 }, 00:21:54.063 { 00:21:54.063 "dma_device_id": "system", 00:21:54.063 "dma_device_type": 1 00:21:54.063 }, 00:21:54.063 { 00:21:54.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.063 "dma_device_type": 2 00:21:54.063 } 00:21:54.063 ], 00:21:54.063 "driver_specific": { 00:21:54.063 "raid": { 00:21:54.063 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:54.063 "strip_size_kb": 0, 00:21:54.063 "state": "online", 00:21:54.063 "raid_level": "raid1", 00:21:54.063 "superblock": true, 00:21:54.063 "num_base_bdevs": 2, 00:21:54.063 "num_base_bdevs_discovered": 2, 00:21:54.063 "num_base_bdevs_operational": 2, 00:21:54.063 "base_bdevs_list": [ 00:21:54.063 { 00:21:54.063 "name": "pt1", 00:21:54.063 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:54.063 "is_configured": true, 00:21:54.063 "data_offset": 256, 00:21:54.063 "data_size": 7936 00:21:54.063 }, 00:21:54.063 { 00:21:54.063 "name": "pt2", 00:21:54.063 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:54.063 "is_configured": true, 00:21:54.063 "data_offset": 256, 00:21:54.063 "data_size": 7936 00:21:54.063 } 00:21:54.063 ] 00:21:54.063 } 00:21:54.063 } 00:21:54.063 }' 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:54.063 pt2' 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:54.063 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
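verify_raid_bdev_state (@116-@126) and verify_raid_bdev_properties (@194-@208) in the trace above reduce to a couple of RPCs plus jq field checks against the dump shown. A condensed sketch of the state check, using an rpc() wrapper as shorthand for the repeated rpc.py invocations (the real helper also walks base_bdevs_list and compares the strip size):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

  [ "$(jq -r .state      <<<"$info")" = online ]            # volume assembled and online
  [ "$(jq -r .raid_level <<<"$info")" = raid1 ]
  [ "$(jq -r .num_base_bdevs_discovered  <<<"$info")" -eq 2 ]
  [ "$(jq -r .num_base_bdevs_operational <<<"$info")" -eq 2 ]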
00:21:54.321 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.321 "name": "pt1", 00:21:54.321 "aliases": [ 00:21:54.321 "00000000-0000-0000-0000-000000000001" 00:21:54.321 ], 00:21:54.321 "product_name": "passthru", 00:21:54.321 "block_size": 4096, 00:21:54.321 "num_blocks": 8192, 00:21:54.321 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:54.321 "assigned_rate_limits": { 00:21:54.322 "rw_ios_per_sec": 0, 00:21:54.322 "rw_mbytes_per_sec": 0, 00:21:54.322 "r_mbytes_per_sec": 0, 00:21:54.322 "w_mbytes_per_sec": 0 00:21:54.322 }, 00:21:54.322 "claimed": true, 00:21:54.322 "claim_type": "exclusive_write", 00:21:54.322 "zoned": false, 00:21:54.322 "supported_io_types": { 00:21:54.322 "read": true, 00:21:54.322 "write": true, 00:21:54.322 "unmap": true, 00:21:54.322 "flush": true, 00:21:54.322 "reset": true, 00:21:54.322 "nvme_admin": false, 00:21:54.322 "nvme_io": false, 00:21:54.322 "nvme_io_md": false, 00:21:54.322 "write_zeroes": true, 00:21:54.322 "zcopy": true, 00:21:54.322 "get_zone_info": false, 00:21:54.322 "zone_management": false, 00:21:54.322 "zone_append": false, 00:21:54.322 "compare": false, 00:21:54.322 "compare_and_write": false, 00:21:54.322 "abort": true, 00:21:54.322 "seek_hole": false, 00:21:54.322 "seek_data": false, 00:21:54.322 "copy": true, 00:21:54.322 "nvme_iov_md": false 00:21:54.322 }, 00:21:54.322 "memory_domains": [ 00:21:54.322 { 00:21:54.322 "dma_device_id": "system", 00:21:54.322 "dma_device_type": 1 00:21:54.322 }, 00:21:54.322 { 00:21:54.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.322 "dma_device_type": 2 00:21:54.322 } 00:21:54.322 ], 00:21:54.322 "driver_specific": { 00:21:54.322 "passthru": { 00:21:54.322 "name": "pt1", 00:21:54.322 "base_bdev_name": "malloc1" 00:21:54.322 } 00:21:54.322 } 00:21:54.322 }' 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.322 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.580 13:44:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:54.580 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.838 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:21:54.838 "name": "pt2", 00:21:54.838 "aliases": [ 00:21:54.838 "00000000-0000-0000-0000-000000000002" 00:21:54.838 ], 00:21:54.838 "product_name": "passthru", 00:21:54.838 "block_size": 4096, 00:21:54.838 "num_blocks": 8192, 00:21:54.838 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:54.838 "assigned_rate_limits": { 00:21:54.838 "rw_ios_per_sec": 0, 00:21:54.838 "rw_mbytes_per_sec": 0, 00:21:54.838 "r_mbytes_per_sec": 0, 00:21:54.838 "w_mbytes_per_sec": 0 00:21:54.838 }, 00:21:54.839 "claimed": true, 00:21:54.839 "claim_type": "exclusive_write", 00:21:54.839 "zoned": false, 00:21:54.839 "supported_io_types": { 00:21:54.839 "read": true, 00:21:54.839 "write": true, 00:21:54.839 "unmap": true, 00:21:54.839 "flush": true, 00:21:54.839 "reset": true, 00:21:54.839 "nvme_admin": false, 00:21:54.839 "nvme_io": false, 00:21:54.839 "nvme_io_md": false, 00:21:54.839 "write_zeroes": true, 00:21:54.839 "zcopy": true, 00:21:54.839 "get_zone_info": false, 00:21:54.839 "zone_management": false, 00:21:54.839 "zone_append": false, 00:21:54.839 "compare": false, 00:21:54.839 "compare_and_write": false, 00:21:54.839 "abort": true, 00:21:54.839 "seek_hole": false, 00:21:54.839 "seek_data": false, 00:21:54.839 "copy": true, 00:21:54.839 "nvme_iov_md": false 00:21:54.839 }, 00:21:54.839 "memory_domains": [ 00:21:54.839 { 00:21:54.839 "dma_device_id": "system", 00:21:54.839 "dma_device_type": 1 00:21:54.839 }, 00:21:54.839 { 00:21:54.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.839 "dma_device_type": 2 00:21:54.839 } 00:21:54.839 ], 00:21:54.839 "driver_specific": { 00:21:54.839 "passthru": { 00:21:54.839 "name": "pt2", 00:21:54.839 "base_bdev_name": "malloc2" 00:21:54.839 } 00:21:54.839 } 00:21:54.839 }' 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.839 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.097 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.097 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.097 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.097 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.097 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:55.097 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:55.355 [2024-07-15 13:44:42.716597] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.355 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d72e3814-0cba-4520-9ea9-de597638fef4 
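The @205-@208 probes above are what make this the _4k variant: each passthru base bdev must expose plain 4096-byte blocks with no metadata, no interleave and no DIF, after which the RAID volume is re-read for its UUID. A sketch of those geometry checks, looping over both base bdevs (the loop and the rpc() wrapper are shorthand, not the literal helper code):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  for pt in pt1 pt2; do
      info=$(rpc bdev_get_bdevs -b "$pt" | jq '.[]')
      [ "$(jq -r .block_size    <<<"$info")" -eq 4096 ]     # 4 KiB data blocks
      [ "$(jq -r .md_size       <<<"$info")" = null ]       # no per-block metadata
      [ "$(jq -r .md_interleave <<<"$info")" = null ]
      [ "$(jq -r .dif_type      <<<"$info")" = null ]       # no DIF protection
  done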
00:21:55.355 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z d72e3814-0cba-4520-9ea9-de597638fef4 ']' 00:21:55.355 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:55.356 [2024-07-15 13:44:42.892877] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:55.356 [2024-07-15 13:44:42.892893] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:55.356 [2024-07-15 13:44:42.892938] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:55.356 [2024-07-15 13:44:42.892976] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:55.356 [2024-07-15 13:44:42.892984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25bf5b0 name raid_bdev1, state offline 00:21:55.356 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.356 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:55.613 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:55.613 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:55.613 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:55.613 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:55.872 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:55.872 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:55.872 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:55.872 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:56.130 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:56.388 [2024-07-15 13:44:43.803204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:56.388 [2024-07-15 13:44:43.804204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:56.388 [2024-07-15 13:44:43.804248] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:56.389 [2024-07-15 13:44:43.804286] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:56.389 [2024-07-15 13:44:43.804299] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:56.389 [2024-07-15 13:44:43.804305] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c0d40 name raid_bdev1, state configuring 00:21:56.389 request: 00:21:56.389 { 00:21:56.389 "name": "raid_bdev1", 00:21:56.389 "raid_level": "raid1", 00:21:56.389 "base_bdevs": [ 00:21:56.389 "malloc1", 00:21:56.389 "malloc2" 00:21:56.389 ], 00:21:56.389 "superblock": false, 00:21:56.389 "method": "bdev_raid_create", 00:21:56.389 "req_id": 1 00:21:56.389 } 00:21:56.389 Got JSON-RPC error response 00:21:56.389 response: 00:21:56.389 { 00:21:56.389 "code": -17, 00:21:56.389 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:56.389 } 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.389 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:56.389 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:56.389 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:56.646 [2024-07-15 13:44:44.148044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:56.646 [2024-07-15 13:44:44.148077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.646 [2024-07-15 13:44:44.148090] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2414ba0 00:21:56.646 [2024-07-15 13:44:44.148098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.646 [2024-07-15 13:44:44.149249] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.646 [2024-07-15 13:44:44.149271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:56.646 [2024-07-15 13:44:44.149318] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:56.646 [2024-07-15 13:44:44.149338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:56.646 pt1 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.646 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.647 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.905 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.905 "name": "raid_bdev1", 00:21:56.905 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:56.905 "strip_size_kb": 0, 00:21:56.905 "state": "configuring", 00:21:56.905 "raid_level": "raid1", 00:21:56.905 "superblock": true, 00:21:56.905 "num_base_bdevs": 2, 00:21:56.905 "num_base_bdevs_discovered": 1, 00:21:56.905 "num_base_bdevs_operational": 2, 00:21:56.905 "base_bdevs_list": [ 00:21:56.905 { 00:21:56.905 "name": "pt1", 00:21:56.905 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:56.905 "is_configured": true, 00:21:56.905 "data_offset": 256, 00:21:56.905 "data_size": 7936 00:21:56.905 }, 00:21:56.905 { 00:21:56.905 "name": null, 00:21:56.905 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:56.905 "is_configured": false, 00:21:56.905 "data_offset": 256, 00:21:56.905 "data_size": 7936 00:21:56.905 } 00:21:56.905 ] 00:21:56.905 }' 00:21:56.905 13:44:44 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.905 13:44:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:57.472 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:21:57.472 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:57.472 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:57.472 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:57.472 [2024-07-15 13:44:45.006279] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:57.472 [2024-07-15 13:44:45.006318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:57.472 [2024-07-15 13:44:45.006332] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2416e80 00:21:57.473 [2024-07-15 13:44:45.006341] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:57.473 [2024-07-15 13:44:45.006602] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:57.473 [2024-07-15 13:44:45.006617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:57.473 [2024-07-15 13:44:45.006662] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:57.473 [2024-07-15 13:44:45.006677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:57.473 [2024-07-15 13:44:45.006752] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c2920 00:21:57.473 [2024-07-15 13:44:45.006760] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:57.473 [2024-07-15 13:44:45.006880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c6110 00:21:57.473 [2024-07-15 13:44:45.006973] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c2920 00:21:57.473 [2024-07-15 13:44:45.006980] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25c2920 00:21:57.473 [2024-07-15 13:44:45.007279] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:57.473 pt2 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.473 13:44:45 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.473 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.732 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.732 "name": "raid_bdev1", 00:21:57.732 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:57.732 "strip_size_kb": 0, 00:21:57.732 "state": "online", 00:21:57.732 "raid_level": "raid1", 00:21:57.732 "superblock": true, 00:21:57.732 "num_base_bdevs": 2, 00:21:57.732 "num_base_bdevs_discovered": 2, 00:21:57.732 "num_base_bdevs_operational": 2, 00:21:57.732 "base_bdevs_list": [ 00:21:57.732 { 00:21:57.732 "name": "pt1", 00:21:57.732 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:57.732 "is_configured": true, 00:21:57.732 "data_offset": 256, 00:21:57.732 "data_size": 7936 00:21:57.732 }, 00:21:57.732 { 00:21:57.732 "name": "pt2", 00:21:57.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:57.732 "is_configured": true, 00:21:57.732 "data_offset": 256, 00:21:57.732 "data_size": 7936 00:21:57.732 } 00:21:57.732 ] 00:21:57.732 }' 00:21:57.732 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.732 13:44:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:58.299 [2024-07-15 13:44:45.868658] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:58.299 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:58.299 "name": "raid_bdev1", 00:21:58.299 "aliases": [ 00:21:58.299 "d72e3814-0cba-4520-9ea9-de597638fef4" 00:21:58.299 ], 00:21:58.299 "product_name": "Raid Volume", 00:21:58.299 "block_size": 4096, 00:21:58.299 "num_blocks": 7936, 00:21:58.299 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:58.299 "assigned_rate_limits": { 00:21:58.299 "rw_ios_per_sec": 0, 00:21:58.299 "rw_mbytes_per_sec": 0, 00:21:58.299 "r_mbytes_per_sec": 0, 00:21:58.299 "w_mbytes_per_sec": 0 00:21:58.299 }, 00:21:58.299 "claimed": false, 00:21:58.299 "zoned": false, 00:21:58.299 
"supported_io_types": { 00:21:58.299 "read": true, 00:21:58.299 "write": true, 00:21:58.299 "unmap": false, 00:21:58.299 "flush": false, 00:21:58.299 "reset": true, 00:21:58.299 "nvme_admin": false, 00:21:58.299 "nvme_io": false, 00:21:58.299 "nvme_io_md": false, 00:21:58.299 "write_zeroes": true, 00:21:58.299 "zcopy": false, 00:21:58.299 "get_zone_info": false, 00:21:58.299 "zone_management": false, 00:21:58.299 "zone_append": false, 00:21:58.299 "compare": false, 00:21:58.299 "compare_and_write": false, 00:21:58.299 "abort": false, 00:21:58.299 "seek_hole": false, 00:21:58.299 "seek_data": false, 00:21:58.299 "copy": false, 00:21:58.299 "nvme_iov_md": false 00:21:58.299 }, 00:21:58.299 "memory_domains": [ 00:21:58.299 { 00:21:58.299 "dma_device_id": "system", 00:21:58.299 "dma_device_type": 1 00:21:58.299 }, 00:21:58.299 { 00:21:58.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.299 "dma_device_type": 2 00:21:58.299 }, 00:21:58.299 { 00:21:58.299 "dma_device_id": "system", 00:21:58.299 "dma_device_type": 1 00:21:58.299 }, 00:21:58.299 { 00:21:58.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.299 "dma_device_type": 2 00:21:58.299 } 00:21:58.299 ], 00:21:58.299 "driver_specific": { 00:21:58.299 "raid": { 00:21:58.299 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:58.299 "strip_size_kb": 0, 00:21:58.299 "state": "online", 00:21:58.299 "raid_level": "raid1", 00:21:58.299 "superblock": true, 00:21:58.299 "num_base_bdevs": 2, 00:21:58.299 "num_base_bdevs_discovered": 2, 00:21:58.299 "num_base_bdevs_operational": 2, 00:21:58.299 "base_bdevs_list": [ 00:21:58.299 { 00:21:58.299 "name": "pt1", 00:21:58.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:58.299 "is_configured": true, 00:21:58.299 "data_offset": 256, 00:21:58.300 "data_size": 7936 00:21:58.300 }, 00:21:58.300 { 00:21:58.300 "name": "pt2", 00:21:58.300 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:58.300 "is_configured": true, 00:21:58.300 "data_offset": 256, 00:21:58.300 "data_size": 7936 00:21:58.300 } 00:21:58.300 ] 00:21:58.300 } 00:21:58.300 } 00:21:58.300 }' 00:21:58.300 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:58.558 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:58.558 pt2' 00:21:58.558 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:58.558 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:58.558 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:58.558 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:58.558 "name": "pt1", 00:21:58.558 "aliases": [ 00:21:58.558 "00000000-0000-0000-0000-000000000001" 00:21:58.558 ], 00:21:58.558 "product_name": "passthru", 00:21:58.558 "block_size": 4096, 00:21:58.558 "num_blocks": 8192, 00:21:58.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:58.558 "assigned_rate_limits": { 00:21:58.558 "rw_ios_per_sec": 0, 00:21:58.558 "rw_mbytes_per_sec": 0, 00:21:58.558 "r_mbytes_per_sec": 0, 00:21:58.558 "w_mbytes_per_sec": 0 00:21:58.558 }, 00:21:58.558 "claimed": true, 00:21:58.558 "claim_type": "exclusive_write", 00:21:58.558 "zoned": false, 00:21:58.558 "supported_io_types": { 00:21:58.558 "read": 
true, 00:21:58.558 "write": true, 00:21:58.558 "unmap": true, 00:21:58.558 "flush": true, 00:21:58.558 "reset": true, 00:21:58.558 "nvme_admin": false, 00:21:58.558 "nvme_io": false, 00:21:58.558 "nvme_io_md": false, 00:21:58.558 "write_zeroes": true, 00:21:58.558 "zcopy": true, 00:21:58.558 "get_zone_info": false, 00:21:58.558 "zone_management": false, 00:21:58.558 "zone_append": false, 00:21:58.558 "compare": false, 00:21:58.558 "compare_and_write": false, 00:21:58.558 "abort": true, 00:21:58.558 "seek_hole": false, 00:21:58.558 "seek_data": false, 00:21:58.558 "copy": true, 00:21:58.558 "nvme_iov_md": false 00:21:58.558 }, 00:21:58.558 "memory_domains": [ 00:21:58.558 { 00:21:58.558 "dma_device_id": "system", 00:21:58.558 "dma_device_type": 1 00:21:58.558 }, 00:21:58.558 { 00:21:58.558 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.558 "dma_device_type": 2 00:21:58.558 } 00:21:58.558 ], 00:21:58.558 "driver_specific": { 00:21:58.558 "passthru": { 00:21:58.558 "name": "pt1", 00:21:58.558 "base_bdev_name": "malloc1" 00:21:58.558 } 00:21:58.558 } 00:21:58.558 }' 00:21:58.558 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.558 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.816 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:58.816 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.816 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.816 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:58.817 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:59.075 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:59.075 "name": "pt2", 00:21:59.075 "aliases": [ 00:21:59.075 "00000000-0000-0000-0000-000000000002" 00:21:59.075 ], 00:21:59.075 "product_name": "passthru", 00:21:59.075 "block_size": 4096, 00:21:59.075 "num_blocks": 8192, 00:21:59.075 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:59.075 "assigned_rate_limits": { 00:21:59.075 "rw_ios_per_sec": 0, 00:21:59.075 "rw_mbytes_per_sec": 0, 00:21:59.075 "r_mbytes_per_sec": 0, 00:21:59.075 "w_mbytes_per_sec": 0 00:21:59.075 }, 00:21:59.075 "claimed": true, 00:21:59.075 "claim_type": "exclusive_write", 00:21:59.075 "zoned": false, 00:21:59.075 "supported_io_types": { 00:21:59.075 "read": true, 00:21:59.075 "write": true, 00:21:59.075 "unmap": true, 00:21:59.075 "flush": 
true, 00:21:59.075 "reset": true, 00:21:59.075 "nvme_admin": false, 00:21:59.075 "nvme_io": false, 00:21:59.075 "nvme_io_md": false, 00:21:59.075 "write_zeroes": true, 00:21:59.075 "zcopy": true, 00:21:59.075 "get_zone_info": false, 00:21:59.075 "zone_management": false, 00:21:59.075 "zone_append": false, 00:21:59.075 "compare": false, 00:21:59.075 "compare_and_write": false, 00:21:59.075 "abort": true, 00:21:59.075 "seek_hole": false, 00:21:59.075 "seek_data": false, 00:21:59.075 "copy": true, 00:21:59.075 "nvme_iov_md": false 00:21:59.075 }, 00:21:59.075 "memory_domains": [ 00:21:59.075 { 00:21:59.075 "dma_device_id": "system", 00:21:59.075 "dma_device_type": 1 00:21:59.075 }, 00:21:59.075 { 00:21:59.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.075 "dma_device_type": 2 00:21:59.075 } 00:21:59.075 ], 00:21:59.075 "driver_specific": { 00:21:59.075 "passthru": { 00:21:59.076 "name": "pt2", 00:21:59.076 "base_bdev_name": "malloc2" 00:21:59.076 } 00:21:59.076 } 00:21:59.076 }' 00:21:59.076 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.076 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.076 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:59.076 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:59.335 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:59.593 [2024-07-15 13:44:47.051700] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:59.593 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' d72e3814-0cba-4520-9ea9-de597638fef4 '!=' d72e3814-0cba-4520-9ea9-de597638fef4 ']' 00:21:59.593 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:59.593 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:59.593 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:59.593 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:59.946 [2024-07-15 13:44:47.219990] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.946 "name": "raid_bdev1", 00:21:59.946 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:21:59.946 "strip_size_kb": 0, 00:21:59.946 "state": "online", 00:21:59.946 "raid_level": "raid1", 00:21:59.946 "superblock": true, 00:21:59.946 "num_base_bdevs": 2, 00:21:59.946 "num_base_bdevs_discovered": 1, 00:21:59.946 "num_base_bdevs_operational": 1, 00:21:59.946 "base_bdevs_list": [ 00:21:59.946 { 00:21:59.946 "name": null, 00:21:59.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.946 "is_configured": false, 00:21:59.946 "data_offset": 256, 00:21:59.946 "data_size": 7936 00:21:59.946 }, 00:21:59.946 { 00:21:59.946 "name": "pt2", 00:21:59.946 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:59.946 "is_configured": true, 00:21:59.946 "data_offset": 256, 00:21:59.946 "data_size": 7936 00:21:59.946 } 00:21:59.946 ] 00:21:59.946 }' 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.946 13:44:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:00.513 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:00.513 [2024-07-15 13:44:48.046104] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:00.513 [2024-07-15 13:44:48.046129] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:00.513 [2024-07-15 13:44:48.046165] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:00.513 [2024-07-15 13:44:48.046194] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:00.513 [2024-07-15 13:44:48.046202] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c2920 name raid_bdev1, state offline 00:22:00.513 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:00.513 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:00.772 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:00.772 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:00.772 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:00.772 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:00.772 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:01.031 [2024-07-15 13:44:48.579470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:01.031 [2024-07-15 13:44:48.579505] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.031 [2024-07-15 13:44:48.579522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2416490 00:22:01.031 [2024-07-15 13:44:48.579532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.031 [2024-07-15 13:44:48.580813] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.031 [2024-07-15 13:44:48.580839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:01.031 [2024-07-15 13:44:48.580894] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:01.031 [2024-07-15 13:44:48.580917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:01.031 [2024-07-15 13:44:48.580991] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c7360 00:22:01.031 [2024-07-15 13:44:48.581008] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:01.031 [2024-07-15 13:44:48.581142] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2417860 00:22:01.031 [2024-07-15 13:44:48.581240] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c7360 00:22:01.031 [2024-07-15 13:44:48.581247] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25c7360 00:22:01.031 [2024-07-15 13:44:48.581323] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:01.031 pt2 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.031 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.290 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.290 "name": "raid_bdev1", 00:22:01.290 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:22:01.290 "strip_size_kb": 0, 00:22:01.290 "state": "online", 00:22:01.290 "raid_level": "raid1", 00:22:01.290 "superblock": true, 00:22:01.290 "num_base_bdevs": 2, 00:22:01.290 "num_base_bdevs_discovered": 1, 00:22:01.290 "num_base_bdevs_operational": 1, 00:22:01.290 "base_bdevs_list": [ 00:22:01.290 { 00:22:01.290 "name": null, 00:22:01.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.290 "is_configured": false, 00:22:01.290 "data_offset": 256, 00:22:01.290 "data_size": 7936 00:22:01.290 }, 00:22:01.290 { 00:22:01.290 "name": "pt2", 00:22:01.290 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.290 "is_configured": true, 00:22:01.290 "data_offset": 256, 00:22:01.290 "data_size": 7936 00:22:01.290 } 00:22:01.290 ] 00:22:01.290 }' 00:22:01.290 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.290 13:44:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:01.857 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:01.857 [2024-07-15 13:44:49.437668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:01.857 [2024-07-15 13:44:49.437690] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:01.857 [2024-07-15 13:44:49.437731] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:01.857 [2024-07-15 13:44:49.437764] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:01.857 [2024-07-15 13:44:49.437772] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c7360 name raid_bdev1, state offline 00:22:01.857 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.857 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:02.115 13:44:49 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:02.115 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:02.115 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:22:02.115 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:02.375 [2024-07-15 13:44:49.798595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:02.375 [2024-07-15 13:44:49.798632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.375 [2024-07-15 13:44:49.798646] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c0540 00:22:02.375 [2024-07-15 13:44:49.798654] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.375 [2024-07-15 13:44:49.799858] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.375 [2024-07-15 13:44:49.799879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:02.375 [2024-07-15 13:44:49.799929] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:02.375 [2024-07-15 13:44:49.799949] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:02.375 [2024-07-15 13:44:49.800028] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:02.375 [2024-07-15 13:44:49.800037] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:02.375 [2024-07-15 13:44:49.800047] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c5230 name raid_bdev1, state configuring 00:22:02.375 [2024-07-15 13:44:49.800063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:02.375 [2024-07-15 13:44:49.800104] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c5d40 00:22:02.375 [2024-07-15 13:44:49.800111] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:02.375 [2024-07-15 13:44:49.800226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c1020 00:22:02.375 [2024-07-15 13:44:49.800309] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c5d40 00:22:02.375 [2024-07-15 13:44:49.800316] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25c5d40 00:22:02.375 [2024-07-15 13:44:49.800382] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:02.375 pt1 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.375 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:02.633 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:02.633 "name": "raid_bdev1", 00:22:02.633 "uuid": "d72e3814-0cba-4520-9ea9-de597638fef4", 00:22:02.633 "strip_size_kb": 0, 00:22:02.633 "state": "online", 00:22:02.633 "raid_level": "raid1", 00:22:02.633 "superblock": true, 00:22:02.633 "num_base_bdevs": 2, 00:22:02.633 "num_base_bdevs_discovered": 1, 00:22:02.633 "num_base_bdevs_operational": 1, 00:22:02.633 "base_bdevs_list": [ 00:22:02.633 { 00:22:02.633 "name": null, 00:22:02.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.633 "is_configured": false, 00:22:02.633 "data_offset": 256, 00:22:02.633 "data_size": 7936 00:22:02.633 }, 00:22:02.633 { 00:22:02.633 "name": "pt2", 00:22:02.633 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:02.633 "is_configured": true, 00:22:02.633 "data_offset": 256, 00:22:02.633 "data_size": 7936 00:22:02.633 } 00:22:02.633 ] 00:22:02.633 }' 00:22:02.633 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:02.633 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:02.891 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:02.891 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:03.150 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:03.150 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:03.150 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:03.410 [2024-07-15 13:44:50.825392] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' d72e3814-0cba-4520-9ea9-de597638fef4 '!=' d72e3814-0cba-4520-9ea9-de597638fef4 ']' 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 86491 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 86491 ']' 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 86491 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # 
'[' Linux = Linux ']' 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 86491 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 86491' 00:22:03.410 killing process with pid 86491 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 86491 00:22:03.410 [2024-07-15 13:44:50.896037] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:03.410 [2024-07-15 13:44:50.896084] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:03.410 [2024-07-15 13:44:50.896127] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:03.410 [2024-07-15 13:44:50.896135] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c5d40 name raid_bdev1, state offline 00:22:03.410 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 86491 00:22:03.410 [2024-07-15 13:44:50.912282] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:03.669 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:22:03.669 00:22:03.669 real 0m12.154s 00:22:03.669 user 0m21.875s 00:22:03.669 sys 0m2.408s 00:22:03.669 13:44:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:03.669 13:44:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:03.669 ************************************ 00:22:03.669 END TEST raid_superblock_test_4k 00:22:03.669 ************************************ 00:22:03.669 13:44:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:03.669 13:44:51 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:22:03.669 13:44:51 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:22:03.669 13:44:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:03.669 13:44:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:03.669 13:44:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:03.669 ************************************ 00:22:03.669 START TEST raid_rebuild_test_sb_4k 00:22:03.669 ************************************ 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:03.669 13:44:51 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=88414 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 88414 /var/tmp/spdk-raid.sock 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 88414 ']' 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:03.669 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:03.669 13:44:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:03.669 [2024-07-15 13:44:51.240123] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:22:03.669 [2024-07-15 13:44:51.240169] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid88414 ] 00:22:03.669 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:03.669 Zero copy mechanism will not be used. 00:22:03.928 [2024-07-15 13:44:51.325482] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:03.928 [2024-07-15 13:44:51.417438] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:03.928 [2024-07-15 13:44:51.481056] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:03.928 [2024-07-15 13:44:51.481086] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:04.495 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:04.496 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:22:04.496 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:04.496 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:22:04.754 BaseBdev1_malloc 00:22:04.754 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:04.754 [2024-07-15 13:44:52.367552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:04.754 [2024-07-15 13:44:52.367590] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:04.754 [2024-07-15 13:44:52.367625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb3c600 00:22:04.754 [2024-07-15 13:44:52.367634] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.754 [2024-07-15 13:44:52.368916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.754 [2024-07-15 13:44:52.368938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:04.754 BaseBdev1 00:22:05.014 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:05.014 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:22:05.014 BaseBdev2_malloc 00:22:05.014 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:05.273 [2024-07-15 13:44:52.708533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:05.273 [2024-07-15 13:44:52.708568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:05.273 [2024-07-15 13:44:52.708604] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb3d120 00:22:05.273 [2024-07-15 13:44:52.708612] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:05.273 [2024-07-15 
13:44:52.709774] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:05.273 [2024-07-15 13:44:52.709798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:05.273 BaseBdev2 00:22:05.273 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:22:05.273 spare_malloc 00:22:05.532 13:44:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:05.532 spare_delay 00:22:05.532 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:05.792 [2024-07-15 13:44:53.209443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:05.792 [2024-07-15 13:44:53.209478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:05.792 [2024-07-15 13:44:53.209514] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xceb780 00:22:05.792 [2024-07-15 13:44:53.209523] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:05.792 [2024-07-15 13:44:53.210711] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:05.792 [2024-07-15 13:44:53.210733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:05.792 spare 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:05.792 [2024-07-15 13:44:53.373900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:05.792 [2024-07-15 13:44:53.374909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:05.792 [2024-07-15 13:44:53.375043] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcec930 00:22:05.792 [2024-07-15 13:44:53.375053] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:05.792 [2024-07-15 13:44:53.375195] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce5d50 00:22:05.792 [2024-07-15 13:44:53.375297] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcec930 00:22:05.792 [2024-07-15 13:44:53.375303] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcec930 00:22:05.792 [2024-07-15 13:44:53.375376] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.792 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.051 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.051 "name": "raid_bdev1", 00:22:06.051 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:06.051 "strip_size_kb": 0, 00:22:06.051 "state": "online", 00:22:06.051 "raid_level": "raid1", 00:22:06.051 "superblock": true, 00:22:06.051 "num_base_bdevs": 2, 00:22:06.051 "num_base_bdevs_discovered": 2, 00:22:06.051 "num_base_bdevs_operational": 2, 00:22:06.051 "base_bdevs_list": [ 00:22:06.051 { 00:22:06.051 "name": "BaseBdev1", 00:22:06.051 "uuid": "86d4a3d6-ad32-5227-b073-914aa5e811d0", 00:22:06.051 "is_configured": true, 00:22:06.051 "data_offset": 256, 00:22:06.051 "data_size": 7936 00:22:06.051 }, 00:22:06.051 { 00:22:06.051 "name": "BaseBdev2", 00:22:06.051 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:06.051 "is_configured": true, 00:22:06.051 "data_offset": 256, 00:22:06.051 "data_size": 7936 00:22:06.051 } 00:22:06.051 ] 00:22:06.051 }' 00:22:06.051 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.051 13:44:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:06.617 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:06.617 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:06.617 [2024-07-15 13:44:54.212201] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:06.617 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:06.875 13:44:54 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:06.875 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:07.133 [2024-07-15 13:44:54.573001] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce5d50 00:22:07.133 /dev/nbd0 00:22:07.133 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:07.134 1+0 records in 00:22:07.134 1+0 records out 00:22:07.134 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253986 s, 16.1 MB/s 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 
00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:07.134 13:44:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:22:07.699 7936+0 records in 00:22:07.699 7936+0 records out 00:22:07.699 32505856 bytes (33 MB, 31 MiB) copied, 0.505984 s, 64.2 MB/s 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:07.699 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:07.957 [2024-07-15 13:44:55.331902] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:07.957 [2024-07-15 13:44:55.498470] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.957 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.217 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.217 "name": "raid_bdev1", 00:22:08.217 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:08.217 "strip_size_kb": 0, 00:22:08.217 "state": "online", 00:22:08.217 "raid_level": "raid1", 00:22:08.217 "superblock": true, 00:22:08.217 "num_base_bdevs": 2, 00:22:08.217 "num_base_bdevs_discovered": 1, 00:22:08.217 "num_base_bdevs_operational": 1, 00:22:08.217 "base_bdevs_list": [ 00:22:08.217 { 00:22:08.217 "name": null, 00:22:08.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.217 "is_configured": false, 00:22:08.217 "data_offset": 256, 00:22:08.217 "data_size": 7936 00:22:08.217 }, 00:22:08.217 { 00:22:08.217 "name": "BaseBdev2", 00:22:08.217 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:08.217 "is_configured": true, 00:22:08.217 "data_offset": 256, 00:22:08.217 "data_size": 7936 00:22:08.217 } 00:22:08.217 ] 00:22:08.217 }' 00:22:08.217 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.217 13:44:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:08.784 13:44:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:08.784 [2024-07-15 13:44:56.324590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:08.784 [2024-07-15 13:44:56.329065] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcec5a0 00:22:08.784 [2024-07-15 13:44:56.330670] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:08.784 13:44:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:10.157 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:10.157 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:10.158 "name": "raid_bdev1", 00:22:10.158 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:10.158 "strip_size_kb": 0, 00:22:10.158 "state": "online", 00:22:10.158 "raid_level": "raid1", 00:22:10.158 "superblock": 
true, 00:22:10.158 "num_base_bdevs": 2, 00:22:10.158 "num_base_bdevs_discovered": 2, 00:22:10.158 "num_base_bdevs_operational": 2, 00:22:10.158 "process": { 00:22:10.158 "type": "rebuild", 00:22:10.158 "target": "spare", 00:22:10.158 "progress": { 00:22:10.158 "blocks": 2816, 00:22:10.158 "percent": 35 00:22:10.158 } 00:22:10.158 }, 00:22:10.158 "base_bdevs_list": [ 00:22:10.158 { 00:22:10.158 "name": "spare", 00:22:10.158 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:10.158 "is_configured": true, 00:22:10.158 "data_offset": 256, 00:22:10.158 "data_size": 7936 00:22:10.158 }, 00:22:10.158 { 00:22:10.158 "name": "BaseBdev2", 00:22:10.158 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:10.158 "is_configured": true, 00:22:10.158 "data_offset": 256, 00:22:10.158 "data_size": 7936 00:22:10.158 } 00:22:10.158 ] 00:22:10.158 }' 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:10.158 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:10.158 [2024-07-15 13:44:57.745563] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:10.416 [2024-07-15 13:44:57.841821] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:10.416 [2024-07-15 13:44:57.841861] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.416 [2024-07-15 13:44:57.841872] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:10.416 [2024-07-15 13:44:57.841878] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.416 13:44:57 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.674 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.674 "name": "raid_bdev1", 00:22:10.674 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:10.674 "strip_size_kb": 0, 00:22:10.674 "state": "online", 00:22:10.674 "raid_level": "raid1", 00:22:10.674 "superblock": true, 00:22:10.674 "num_base_bdevs": 2, 00:22:10.674 "num_base_bdevs_discovered": 1, 00:22:10.674 "num_base_bdevs_operational": 1, 00:22:10.674 "base_bdevs_list": [ 00:22:10.674 { 00:22:10.674 "name": null, 00:22:10.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.674 "is_configured": false, 00:22:10.674 "data_offset": 256, 00:22:10.674 "data_size": 7936 00:22:10.674 }, 00:22:10.674 { 00:22:10.674 "name": "BaseBdev2", 00:22:10.674 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:10.674 "is_configured": true, 00:22:10.674 "data_offset": 256, 00:22:10.674 "data_size": 7936 00:22:10.674 } 00:22:10.674 ] 00:22:10.674 }' 00:22:10.674 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.674 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.932 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.189 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.189 "name": "raid_bdev1", 00:22:11.189 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:11.189 "strip_size_kb": 0, 00:22:11.189 "state": "online", 00:22:11.189 "raid_level": "raid1", 00:22:11.189 "superblock": true, 00:22:11.189 "num_base_bdevs": 2, 00:22:11.189 "num_base_bdevs_discovered": 1, 00:22:11.189 "num_base_bdevs_operational": 1, 00:22:11.189 "base_bdevs_list": [ 00:22:11.189 { 00:22:11.189 "name": null, 00:22:11.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.189 "is_configured": false, 00:22:11.189 "data_offset": 256, 00:22:11.189 "data_size": 7936 00:22:11.189 }, 00:22:11.189 { 00:22:11.189 "name": "BaseBdev2", 00:22:11.189 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:11.189 "is_configured": true, 00:22:11.189 "data_offset": 256, 00:22:11.189 "data_size": 7936 00:22:11.189 } 00:22:11.189 ] 00:22:11.189 }' 00:22:11.189 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.189 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:11.189 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.189 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == 
\n\o\n\e ]] 00:22:11.189 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:11.446 [2024-07-15 13:44:58.944860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:11.446 [2024-07-15 13:44:58.950033] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcec5a0 00:22:11.446 [2024-07-15 13:44:58.951147] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:11.446 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.380 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:12.638 "name": "raid_bdev1", 00:22:12.638 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:12.638 "strip_size_kb": 0, 00:22:12.638 "state": "online", 00:22:12.638 "raid_level": "raid1", 00:22:12.638 "superblock": true, 00:22:12.638 "num_base_bdevs": 2, 00:22:12.638 "num_base_bdevs_discovered": 2, 00:22:12.638 "num_base_bdevs_operational": 2, 00:22:12.638 "process": { 00:22:12.638 "type": "rebuild", 00:22:12.638 "target": "spare", 00:22:12.638 "progress": { 00:22:12.638 "blocks": 2816, 00:22:12.638 "percent": 35 00:22:12.638 } 00:22:12.638 }, 00:22:12.638 "base_bdevs_list": [ 00:22:12.638 { 00:22:12.638 "name": "spare", 00:22:12.638 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:12.638 "is_configured": true, 00:22:12.638 "data_offset": 256, 00:22:12.638 "data_size": 7936 00:22:12.638 }, 00:22:12.638 { 00:22:12.638 "name": "BaseBdev2", 00:22:12.638 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:12.638 "is_configured": true, 00:22:12.638 "data_offset": 256, 00:22:12.638 "data_size": 7936 00:22:12.638 } 00:22:12.638 ] 00:22:12.638 }' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:12.638 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 
665: [: =: unary operator expected 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=800 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.638 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.896 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:12.896 "name": "raid_bdev1", 00:22:12.896 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:12.896 "strip_size_kb": 0, 00:22:12.896 "state": "online", 00:22:12.896 "raid_level": "raid1", 00:22:12.896 "superblock": true, 00:22:12.896 "num_base_bdevs": 2, 00:22:12.896 "num_base_bdevs_discovered": 2, 00:22:12.896 "num_base_bdevs_operational": 2, 00:22:12.896 "process": { 00:22:12.896 "type": "rebuild", 00:22:12.896 "target": "spare", 00:22:12.896 "progress": { 00:22:12.896 "blocks": 3584, 00:22:12.896 "percent": 45 00:22:12.896 } 00:22:12.896 }, 00:22:12.896 "base_bdevs_list": [ 00:22:12.896 { 00:22:12.896 "name": "spare", 00:22:12.896 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:12.896 "is_configured": true, 00:22:12.896 "data_offset": 256, 00:22:12.896 "data_size": 7936 00:22:12.896 }, 00:22:12.896 { 00:22:12.896 "name": "BaseBdev2", 00:22:12.896 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:12.896 "is_configured": true, 00:22:12.896 "data_offset": 256, 00:22:12.896 "data_size": 7936 00:22:12.896 } 00:22:12.896 ] 00:22:12.896 }' 00:22:12.896 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.896 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:12.896 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.896 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:12.896 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 
-- # local raid_bdev_name=raid_bdev1 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.270 "name": "raid_bdev1", 00:22:14.270 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:14.270 "strip_size_kb": 0, 00:22:14.270 "state": "online", 00:22:14.270 "raid_level": "raid1", 00:22:14.270 "superblock": true, 00:22:14.270 "num_base_bdevs": 2, 00:22:14.270 "num_base_bdevs_discovered": 2, 00:22:14.270 "num_base_bdevs_operational": 2, 00:22:14.270 "process": { 00:22:14.270 "type": "rebuild", 00:22:14.270 "target": "spare", 00:22:14.270 "progress": { 00:22:14.270 "blocks": 6656, 00:22:14.270 "percent": 83 00:22:14.270 } 00:22:14.270 }, 00:22:14.270 "base_bdevs_list": [ 00:22:14.270 { 00:22:14.270 "name": "spare", 00:22:14.270 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:14.270 "is_configured": true, 00:22:14.270 "data_offset": 256, 00:22:14.270 "data_size": 7936 00:22:14.270 }, 00:22:14.270 { 00:22:14.270 "name": "BaseBdev2", 00:22:14.270 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:14.270 "is_configured": true, 00:22:14.270 "data_offset": 256, 00:22:14.270 "data_size": 7936 00:22:14.270 } 00:22:14.270 ] 00:22:14.270 }' 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:14.270 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:14.528 [2024-07-15 13:45:02.073807] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:14.528 [2024-07-15 13:45:02.073854] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:14.528 [2024-07-15 13:45:02.073932] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:15.461 13:45:02 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.461 "name": "raid_bdev1", 00:22:15.461 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:15.461 "strip_size_kb": 0, 00:22:15.461 "state": "online", 00:22:15.461 "raid_level": "raid1", 00:22:15.461 "superblock": true, 00:22:15.461 "num_base_bdevs": 2, 00:22:15.461 "num_base_bdevs_discovered": 2, 00:22:15.461 "num_base_bdevs_operational": 2, 00:22:15.461 "base_bdevs_list": [ 00:22:15.461 { 00:22:15.461 "name": "spare", 00:22:15.461 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:15.461 "is_configured": true, 00:22:15.461 "data_offset": 256, 00:22:15.461 "data_size": 7936 00:22:15.461 }, 00:22:15.461 { 00:22:15.461 "name": "BaseBdev2", 00:22:15.461 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:15.461 "is_configured": true, 00:22:15.461 "data_offset": 256, 00:22:15.461 "data_size": 7936 00:22:15.461 } 00:22:15.461 ] 00:22:15.461 }' 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:15.461 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.461 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.719 "name": "raid_bdev1", 00:22:15.719 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:15.719 "strip_size_kb": 0, 00:22:15.719 "state": "online", 00:22:15.719 "raid_level": "raid1", 00:22:15.719 "superblock": true, 00:22:15.719 "num_base_bdevs": 2, 00:22:15.719 "num_base_bdevs_discovered": 2, 00:22:15.719 "num_base_bdevs_operational": 2, 00:22:15.719 "base_bdevs_list": [ 00:22:15.719 { 00:22:15.719 "name": "spare", 00:22:15.719 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:15.719 "is_configured": true, 00:22:15.719 "data_offset": 256, 00:22:15.719 "data_size": 7936 00:22:15.719 }, 00:22:15.719 { 00:22:15.719 "name": "BaseBdev2", 00:22:15.719 "uuid": 
"fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:15.719 "is_configured": true, 00:22:15.719 "data_offset": 256, 00:22:15.719 "data_size": 7936 00:22:15.719 } 00:22:15.719 ] 00:22:15.719 }' 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.719 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.977 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.977 "name": "raid_bdev1", 00:22:15.977 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:15.977 "strip_size_kb": 0, 00:22:15.977 "state": "online", 00:22:15.977 "raid_level": "raid1", 00:22:15.977 "superblock": true, 00:22:15.977 "num_base_bdevs": 2, 00:22:15.977 "num_base_bdevs_discovered": 2, 00:22:15.977 "num_base_bdevs_operational": 2, 00:22:15.977 "base_bdevs_list": [ 00:22:15.977 { 00:22:15.977 "name": "spare", 00:22:15.977 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:15.977 "is_configured": true, 00:22:15.977 "data_offset": 256, 00:22:15.977 "data_size": 7936 00:22:15.977 }, 00:22:15.977 { 00:22:15.977 "name": "BaseBdev2", 00:22:15.977 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:15.977 "is_configured": true, 00:22:15.977 "data_offset": 256, 00:22:15.977 "data_size": 7936 00:22:15.977 } 00:22:15.977 ] 00:22:15.977 }' 00:22:15.977 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.977 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:16.542 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:16.801 [2024-07-15 13:45:04.164005] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:16.801 [2024-07-15 13:45:04.164027] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:16.801 [2024-07-15 13:45:04.164069] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:16.801 [2024-07-15 13:45:04.164110] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:16.801 [2024-07-15 13:45:04.164118] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcec930 name raid_bdev1, state offline 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:16.801 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:17.060 /dev/nbd0 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- 
# (( i <= 20 )) 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:17.060 1+0 records in 00:22:17.060 1+0 records out 00:22:17.060 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248164 s, 16.5 MB/s 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:17.060 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:17.318 /dev/nbd1 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:17.318 1+0 records in 00:22:17.318 1+0 records out 00:22:17.318 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276168 s, 14.8 MB/s 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:22:17.318 13:45:04 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:17.318 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:17.576 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:17.836 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:18.094 [2024-07-15 13:45:05.618642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:18.094 [2024-07-15 13:45:05.618677] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.094 [2024-07-15 13:45:05.618713] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcebdc0 00:22:18.094 [2024-07-15 13:45:05.618722] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.094 [2024-07-15 13:45:05.619923] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:18.094 [2024-07-15 13:45:05.619946] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:18.094 [2024-07-15 13:45:05.620018] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:18.094 [2024-07-15 13:45:05.620038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:18.094 [2024-07-15 13:45:05.620111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:18.094 spare 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.094 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.352 [2024-07-15 13:45:05.720405] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcecd50 00:22:18.352 [2024-07-15 13:45:05.720419] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:18.352 [2024-07-15 13:45:05.720572] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce5810 00:22:18.352 [2024-07-15 13:45:05.720695] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcecd50 00:22:18.352 [2024-07-15 13:45:05.720703] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcecd50 00:22:18.352 [2024-07-15 13:45:05.720784] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:18.352 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:22:18.352 "name": "raid_bdev1", 00:22:18.352 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:18.352 "strip_size_kb": 0, 00:22:18.352 "state": "online", 00:22:18.352 "raid_level": "raid1", 00:22:18.352 "superblock": true, 00:22:18.352 "num_base_bdevs": 2, 00:22:18.352 "num_base_bdevs_discovered": 2, 00:22:18.352 "num_base_bdevs_operational": 2, 00:22:18.352 "base_bdevs_list": [ 00:22:18.352 { 00:22:18.352 "name": "spare", 00:22:18.352 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:18.352 "is_configured": true, 00:22:18.352 "data_offset": 256, 00:22:18.352 "data_size": 7936 00:22:18.352 }, 00:22:18.352 { 00:22:18.352 "name": "BaseBdev2", 00:22:18.352 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:18.352 "is_configured": true, 00:22:18.352 "data_offset": 256, 00:22:18.352 "data_size": 7936 00:22:18.352 } 00:22:18.352 ] 00:22:18.352 }' 00:22:18.352 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.352 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:18.918 "name": "raid_bdev1", 00:22:18.918 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:18.918 "strip_size_kb": 0, 00:22:18.918 "state": "online", 00:22:18.918 "raid_level": "raid1", 00:22:18.918 "superblock": true, 00:22:18.918 "num_base_bdevs": 2, 00:22:18.918 "num_base_bdevs_discovered": 2, 00:22:18.918 "num_base_bdevs_operational": 2, 00:22:18.918 "base_bdevs_list": [ 00:22:18.918 { 00:22:18.918 "name": "spare", 00:22:18.918 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:18.918 "is_configured": true, 00:22:18.918 "data_offset": 256, 00:22:18.918 "data_size": 7936 00:22:18.918 }, 00:22:18.918 { 00:22:18.918 "name": "BaseBdev2", 00:22:18.918 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:18.918 "is_configured": true, 00:22:18.918 "data_offset": 256, 00:22:18.918 "data_size": 7936 00:22:18.918 } 00:22:18.918 ] 00:22:18.918 }' 00:22:18.918 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:19.176 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:19.433 [2024-07-15 13:45:06.934124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.433 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.691 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.691 "name": "raid_bdev1", 00:22:19.691 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:19.691 "strip_size_kb": 0, 00:22:19.691 "state": "online", 00:22:19.691 "raid_level": "raid1", 00:22:19.691 "superblock": true, 00:22:19.691 "num_base_bdevs": 2, 00:22:19.691 "num_base_bdevs_discovered": 1, 00:22:19.691 "num_base_bdevs_operational": 1, 00:22:19.691 "base_bdevs_list": [ 00:22:19.691 { 00:22:19.691 "name": null, 00:22:19.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.691 "is_configured": false, 00:22:19.691 "data_offset": 256, 00:22:19.691 "data_size": 7936 00:22:19.691 }, 00:22:19.691 { 00:22:19.691 "name": "BaseBdev2", 00:22:19.691 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:19.691 "is_configured": true, 00:22:19.691 "data_offset": 256, 00:22:19.691 "data_size": 7936 00:22:19.691 } 00:22:19.691 ] 00:22:19.691 }' 00:22:19.691 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.691 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:20.268 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:20.268 [2024-07-15 13:45:07.800361] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:20.268 [2024-07-15 13:45:07.800475] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:20.268 [2024-07-15 13:45:07.800487] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:20.268 [2024-07-15 13:45:07.800507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:20.268 [2024-07-15 13:45:07.804944] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x816fc0 00:22:20.268 [2024-07-15 13:45:07.806648] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:20.268 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.641 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.641 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:21.641 "name": "raid_bdev1", 00:22:21.641 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:21.641 "strip_size_kb": 0, 00:22:21.641 "state": "online", 00:22:21.641 "raid_level": "raid1", 00:22:21.641 "superblock": true, 00:22:21.641 "num_base_bdevs": 2, 00:22:21.641 "num_base_bdevs_discovered": 2, 00:22:21.641 "num_base_bdevs_operational": 2, 00:22:21.641 "process": { 00:22:21.641 "type": "rebuild", 00:22:21.641 "target": "spare", 00:22:21.641 "progress": { 00:22:21.641 "blocks": 2816, 00:22:21.641 "percent": 35 00:22:21.641 } 00:22:21.641 }, 00:22:21.641 "base_bdevs_list": [ 00:22:21.641 { 00:22:21.641 "name": "spare", 00:22:21.641 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:21.641 "is_configured": true, 00:22:21.641 "data_offset": 256, 00:22:21.641 "data_size": 7936 00:22:21.641 }, 00:22:21.641 { 00:22:21.641 "name": "BaseBdev2", 00:22:21.641 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:21.641 "is_configured": true, 00:22:21.641 "data_offset": 256, 00:22:21.641 "data_size": 7936 00:22:21.641 } 00:22:21.641 ] 00:22:21.641 }' 00:22:21.641 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:21.641 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:21.641 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:21.641 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:21.641 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:21.897 [2024-07-15 13:45:09.261394] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:21.897 [2024-07-15 13:45:09.317503] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:21.897 [2024-07-15 13:45:09.317532] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.897 [2024-07-15 13:45:09.317542] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:21.897 [2024-07-15 13:45:09.317551] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.897 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.153 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.153 "name": "raid_bdev1", 00:22:22.153 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:22.153 "strip_size_kb": 0, 00:22:22.153 "state": "online", 00:22:22.153 "raid_level": "raid1", 00:22:22.153 "superblock": true, 00:22:22.153 "num_base_bdevs": 2, 00:22:22.153 "num_base_bdevs_discovered": 1, 00:22:22.153 "num_base_bdevs_operational": 1, 00:22:22.153 "base_bdevs_list": [ 00:22:22.153 { 00:22:22.153 "name": null, 00:22:22.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.153 "is_configured": false, 00:22:22.153 "data_offset": 256, 00:22:22.153 "data_size": 7936 00:22:22.153 }, 00:22:22.153 { 00:22:22.153 "name": "BaseBdev2", 00:22:22.153 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:22.153 "is_configured": true, 00:22:22.153 "data_offset": 256, 00:22:22.153 "data_size": 7936 00:22:22.153 } 00:22:22.153 ] 00:22:22.153 }' 00:22:22.153 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.153 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:22.717 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:22:22.717 [2024-07-15 13:45:10.180572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:22.717 [2024-07-15 13:45:10.180613] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.717 [2024-07-15 13:45:10.180632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb33350 00:22:22.717 [2024-07-15 13:45:10.180641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.717 [2024-07-15 13:45:10.180926] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.717 [2024-07-15 13:45:10.180939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:22.717 [2024-07-15 13:45:10.181007] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:22.717 [2024-07-15 13:45:10.181016] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:22.717 [2024-07-15 13:45:10.181023] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:22.717 [2024-07-15 13:45:10.181037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:22.717 [2024-07-15 13:45:10.185462] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcebac0 00:22:22.717 [2024-07-15 13:45:10.186525] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:22.717 spare 00:22:22.717 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.647 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.922 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:23.922 "name": "raid_bdev1", 00:22:23.922 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:23.922 "strip_size_kb": 0, 00:22:23.922 "state": "online", 00:22:23.922 "raid_level": "raid1", 00:22:23.922 "superblock": true, 00:22:23.922 "num_base_bdevs": 2, 00:22:23.922 "num_base_bdevs_discovered": 2, 00:22:23.922 "num_base_bdevs_operational": 2, 00:22:23.922 "process": { 00:22:23.922 "type": "rebuild", 00:22:23.922 "target": "spare", 00:22:23.922 "progress": { 00:22:23.922 "blocks": 2816, 00:22:23.922 "percent": 35 00:22:23.922 } 00:22:23.922 }, 00:22:23.922 "base_bdevs_list": [ 00:22:23.922 { 00:22:23.922 "name": "spare", 00:22:23.922 "uuid": "aadee171-2cd3-5025-9585-20b9cd6c5fd4", 00:22:23.922 "is_configured": true, 00:22:23.922 "data_offset": 256, 00:22:23.922 "data_size": 7936 00:22:23.922 }, 00:22:23.922 { 
00:22:23.922 "name": "BaseBdev2", 00:22:23.922 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:23.922 "is_configured": true, 00:22:23.922 "data_offset": 256, 00:22:23.922 "data_size": 7936 00:22:23.922 } 00:22:23.922 ] 00:22:23.922 }' 00:22:23.922 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:23.922 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:23.922 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:23.922 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:23.922 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:24.180 [2024-07-15 13:45:11.621153] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:24.180 [2024-07-15 13:45:11.697800] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:24.180 [2024-07-15 13:45:11.697836] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.180 [2024-07-15 13:45:11.697863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:24.180 [2024-07-15 13:45:11.697869] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.180 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.438 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.438 "name": "raid_bdev1", 00:22:24.438 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:24.438 "strip_size_kb": 0, 00:22:24.438 "state": "online", 00:22:24.438 "raid_level": "raid1", 00:22:24.438 "superblock": true, 00:22:24.438 "num_base_bdevs": 2, 00:22:24.438 "num_base_bdevs_discovered": 1, 00:22:24.438 "num_base_bdevs_operational": 1, 00:22:24.438 "base_bdevs_list": [ 00:22:24.438 { 00:22:24.438 "name": null, 00:22:24.438 
"uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.439 "is_configured": false, 00:22:24.439 "data_offset": 256, 00:22:24.439 "data_size": 7936 00:22:24.439 }, 00:22:24.439 { 00:22:24.439 "name": "BaseBdev2", 00:22:24.439 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:24.439 "is_configured": true, 00:22:24.439 "data_offset": 256, 00:22:24.439 "data_size": 7936 00:22:24.439 } 00:22:24.439 ] 00:22:24.439 }' 00:22:24.439 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.439 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.003 "name": "raid_bdev1", 00:22:25.003 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:25.003 "strip_size_kb": 0, 00:22:25.003 "state": "online", 00:22:25.003 "raid_level": "raid1", 00:22:25.003 "superblock": true, 00:22:25.003 "num_base_bdevs": 2, 00:22:25.003 "num_base_bdevs_discovered": 1, 00:22:25.003 "num_base_bdevs_operational": 1, 00:22:25.003 "base_bdevs_list": [ 00:22:25.003 { 00:22:25.003 "name": null, 00:22:25.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.003 "is_configured": false, 00:22:25.003 "data_offset": 256, 00:22:25.003 "data_size": 7936 00:22:25.003 }, 00:22:25.003 { 00:22:25.003 "name": "BaseBdev2", 00:22:25.003 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:25.003 "is_configured": true, 00:22:25.003 "data_offset": 256, 00:22:25.003 "data_size": 7936 00:22:25.003 } 00:22:25.003 ] 00:22:25.003 }' 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.003 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:25.261 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.261 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:25.261 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:25.261 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:25.519 [2024-07-15 13:45:12.973896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:25.519 [2024-07-15 13:45:12.973930] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:25.519 [2024-07-15 13:45:12.973962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcebff0 00:22:25.519 [2024-07-15 13:45:12.973970] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.519 [2024-07-15 13:45:12.974228] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.519 [2024-07-15 13:45:12.974240] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:25.519 [2024-07-15 13:45:12.974286] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:25.519 [2024-07-15 13:45:12.974299] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:25.519 [2024-07-15 13:45:12.974307] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:25.519 BaseBdev1 00:22:25.519 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.463 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.463 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.463 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.463 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.463 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.721 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.721 "name": "raid_bdev1", 00:22:26.721 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:26.721 "strip_size_kb": 0, 00:22:26.721 "state": "online", 00:22:26.721 "raid_level": "raid1", 00:22:26.721 "superblock": true, 00:22:26.721 "num_base_bdevs": 2, 00:22:26.721 "num_base_bdevs_discovered": 1, 00:22:26.721 "num_base_bdevs_operational": 1, 00:22:26.721 "base_bdevs_list": [ 00:22:26.721 { 00:22:26.721 "name": null, 00:22:26.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.721 "is_configured": false, 00:22:26.721 "data_offset": 256, 00:22:26.721 "data_size": 7936 00:22:26.721 }, 00:22:26.721 { 00:22:26.721 "name": "BaseBdev2", 00:22:26.721 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:26.721 "is_configured": true, 00:22:26.721 "data_offset": 256, 00:22:26.721 "data_size": 7936 00:22:26.721 } 00:22:26.721 
] 00:22:26.721 }' 00:22:26.721 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.721 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:27.286 "name": "raid_bdev1", 00:22:27.286 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:27.286 "strip_size_kb": 0, 00:22:27.286 "state": "online", 00:22:27.286 "raid_level": "raid1", 00:22:27.286 "superblock": true, 00:22:27.286 "num_base_bdevs": 2, 00:22:27.286 "num_base_bdevs_discovered": 1, 00:22:27.286 "num_base_bdevs_operational": 1, 00:22:27.286 "base_bdevs_list": [ 00:22:27.286 { 00:22:27.286 "name": null, 00:22:27.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.286 "is_configured": false, 00:22:27.286 "data_offset": 256, 00:22:27.286 "data_size": 7936 00:22:27.286 }, 00:22:27.286 { 00:22:27.286 "name": "BaseBdev2", 00:22:27.286 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:27.286 "is_configured": true, 00:22:27.286 "data_offset": 256, 00:22:27.286 "data_size": 7936 00:22:27.286 } 00:22:27.286 ] 00:22:27.286 }' 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:27.286 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:27.544 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:27.544 [2024-07-15 13:45:15.099535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:27.544 [2024-07-15 13:45:15.099634] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:27.544 [2024-07-15 13:45:15.099644] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:27.544 request: 00:22:27.544 { 00:22:27.544 "base_bdev": "BaseBdev1", 00:22:27.544 "raid_bdev": "raid_bdev1", 00:22:27.544 "method": "bdev_raid_add_base_bdev", 00:22:27.544 "req_id": 1 00:22:27.544 } 00:22:27.544 Got JSON-RPC error response 00:22:27.544 response: 00:22:27.544 { 00:22:27.544 "code": -22, 00:22:27.544 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:27.544 } 00:22:27.544 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:22:27.544 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:27.544 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:27.544 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:27.544 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:28.916 
13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.916 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.917 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.917 "name": "raid_bdev1", 00:22:28.917 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:28.917 "strip_size_kb": 0, 00:22:28.917 "state": "online", 00:22:28.917 "raid_level": "raid1", 00:22:28.917 "superblock": true, 00:22:28.917 "num_base_bdevs": 2, 00:22:28.917 "num_base_bdevs_discovered": 1, 00:22:28.917 "num_base_bdevs_operational": 1, 00:22:28.917 "base_bdevs_list": [ 00:22:28.917 { 00:22:28.917 "name": null, 00:22:28.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.917 "is_configured": false, 00:22:28.917 "data_offset": 256, 00:22:28.917 "data_size": 7936 00:22:28.917 }, 00:22:28.917 { 00:22:28.917 "name": "BaseBdev2", 00:22:28.917 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:28.917 "is_configured": true, 00:22:28.917 "data_offset": 256, 00:22:28.917 "data_size": 7936 00:22:28.917 } 00:22:28.917 ] 00:22:28.917 }' 00:22:28.917 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.917 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.481 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:29.481 "name": "raid_bdev1", 00:22:29.481 "uuid": "03d629f4-fa4f-4ffd-b16a-6666dfaf96ce", 00:22:29.481 "strip_size_kb": 0, 00:22:29.481 "state": "online", 00:22:29.481 "raid_level": "raid1", 00:22:29.481 "superblock": true, 00:22:29.481 "num_base_bdevs": 2, 00:22:29.481 "num_base_bdevs_discovered": 1, 00:22:29.481 "num_base_bdevs_operational": 1, 00:22:29.481 "base_bdevs_list": [ 00:22:29.481 { 00:22:29.481 "name": null, 00:22:29.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.481 "is_configured": false, 00:22:29.481 "data_offset": 256, 00:22:29.481 "data_size": 7936 00:22:29.481 }, 00:22:29.481 { 00:22:29.481 "name": "BaseBdev2", 00:22:29.481 "uuid": "fe0d82a1-acae-5b12-8ca0-1bb00740fc13", 00:22:29.481 "is_configured": true, 00:22:29.482 "data_offset": 256, 00:22:29.482 "data_size": 7936 00:22:29.482 } 00:22:29.482 ] 00:22:29.482 }' 00:22:29.482 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 88414 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 88414 ']' 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 88414 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 88414 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 88414' 00:22:29.482 killing process with pid 88414 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 88414 00:22:29.482 Received shutdown signal, test time was about 60.000000 seconds 00:22:29.482 00:22:29.482 Latency(us) 00:22:29.482 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:29.482 =================================================================================================================== 00:22:29.482 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:29.482 [2024-07-15 13:45:17.097257] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:29.482 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 88414 00:22:29.482 [2024-07-15 13:45:17.097327] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:29.482 [2024-07-15 13:45:17.097360] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:29.482 [2024-07-15 13:45:17.097368] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcecd50 name raid_bdev1, state offline 00:22:29.739 [2024-07-15 13:45:17.127570] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:29.739 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:22:29.739 00:22:29.739 real 0m26.145s 00:22:29.739 user 0m39.455s 00:22:29.739 sys 0m4.205s 00:22:29.739 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:29.739 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:29.739 ************************************ 00:22:29.739 END TEST raid_rebuild_test_sb_4k 00:22:29.739 ************************************ 00:22:29.997 13:45:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:29.997 13:45:17 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:22:29.997 13:45:17 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:22:29.997 13:45:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
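The raid_rebuild_test_sb_4k run above drives the target entirely through scripts/rpc.py on the /var/tmp/spdk-raid.sock socket: it repeatedly calls bdev_raid_get_bdevs all, selects raid_bdev1 with jq, and inspects .process.type / .process.target to decide whether a rebuild is still running before and after removing and re-adding the spare base bdev. A minimal sketch of that polling pattern, assuming an SPDK target is already listening on the same socket and a raid bdev named raid_bdev1 exists (the 60-iteration timeout is an illustrative assumption, not taken from the test), is:

  # Sketch only: poll bdev_raid_get_bdevs until no rebuild process is reported.
  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in $(seq 1 60); do
      info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
      ptype=$(echo "$info" | jq -r '.process.type // "none"')
      target=$(echo "$info" | jq -r '.process.target // "none"')
      [ "$ptype" = "none" ] && [ "$target" = "none" ] && break
      sleep 1
  done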
00:22:29.997 13:45:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:29.997 13:45:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:29.997 ************************************ 00:22:29.997 START TEST raid_state_function_test_sb_md_separate 00:22:29.997 ************************************ 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=92798 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 92798' 00:22:29.997 Process raid pid: 92798 00:22:29.997 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@246 -- # waitforlisten 92798 /var/tmp/spdk-raid.sock 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 92798 ']' 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:29.998 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:29.998 13:45:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:29.998 [2024-07-15 13:45:17.459203] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:22:29.998 [2024-07-15 13:45:17.459256] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:29.998 [2024-07-15 13:45:17.545731] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:30.255 [2024-07-15 13:45:17.634199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:30.255 [2024-07-15 13:45:17.683919] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.255 [2024-07-15 13:45:17.683942] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.820 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:30.820 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:30.820 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:30.820 [2024-07-15 13:45:18.428463] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:30.820 [2024-07-15 13:45:18.428497] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:30.820 [2024-07-15 13:45:18.428507] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:30.820 [2024-07-15 13:45:18.428515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.077 "name": "Existed_Raid", 00:22:31.077 "uuid": "62dbedfc-d9bd-4c61-920e-944bbb40075d", 00:22:31.077 "strip_size_kb": 0, 00:22:31.077 "state": "configuring", 00:22:31.077 "raid_level": "raid1", 00:22:31.077 "superblock": true, 00:22:31.077 "num_base_bdevs": 2, 00:22:31.077 "num_base_bdevs_discovered": 0, 00:22:31.077 "num_base_bdevs_operational": 2, 00:22:31.077 "base_bdevs_list": [ 00:22:31.077 { 00:22:31.077 "name": "BaseBdev1", 00:22:31.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:31.077 "is_configured": false, 00:22:31.077 "data_offset": 0, 00:22:31.077 "data_size": 0 00:22:31.077 }, 00:22:31.077 { 00:22:31.077 "name": "BaseBdev2", 00:22:31.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:31.077 "is_configured": false, 00:22:31.077 "data_offset": 0, 00:22:31.077 "data_size": 0 00:22:31.077 } 00:22:31.077 ] 00:22:31.077 }' 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.077 13:45:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:31.640 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:31.896 [2024-07-15 13:45:19.298612] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:31.896 [2024-07-15 13:45:19.298635] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f2f30 name Existed_Raid, state configuring 00:22:31.896 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:31.896 [2024-07-15 13:45:19.471080] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:31.896 [2024-07-15 13:45:19.471104] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:31.896 [2024-07-15 13:45:19.471110] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:31.896 [2024-07-15 13:45:19.471118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:31.896 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:22:32.152 [2024-07-15 13:45:19.656878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:32.152 BaseBdev1 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:32.152 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:32.408 13:45:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:32.408 [ 00:22:32.408 { 00:22:32.408 "name": "BaseBdev1", 00:22:32.408 "aliases": [ 00:22:32.408 "77cfe407-0644-43a8-8ecd-739d5dd258d6" 00:22:32.408 ], 00:22:32.408 "product_name": "Malloc disk", 00:22:32.408 "block_size": 4096, 00:22:32.408 "num_blocks": 8192, 00:22:32.408 "uuid": "77cfe407-0644-43a8-8ecd-739d5dd258d6", 00:22:32.408 "md_size": 32, 00:22:32.408 "md_interleave": false, 00:22:32.408 "dif_type": 0, 00:22:32.408 "assigned_rate_limits": { 00:22:32.408 "rw_ios_per_sec": 0, 00:22:32.408 "rw_mbytes_per_sec": 0, 00:22:32.408 "r_mbytes_per_sec": 0, 00:22:32.408 "w_mbytes_per_sec": 0 00:22:32.408 }, 00:22:32.408 "claimed": true, 00:22:32.408 "claim_type": "exclusive_write", 00:22:32.408 "zoned": false, 00:22:32.408 "supported_io_types": { 00:22:32.408 "read": true, 00:22:32.408 "write": true, 00:22:32.408 "unmap": true, 00:22:32.408 "flush": true, 00:22:32.408 "reset": true, 00:22:32.408 "nvme_admin": false, 00:22:32.408 "nvme_io": false, 00:22:32.408 "nvme_io_md": false, 00:22:32.408 "write_zeroes": true, 00:22:32.408 "zcopy": true, 00:22:32.408 "get_zone_info": false, 00:22:32.408 "zone_management": false, 00:22:32.408 "zone_append": false, 00:22:32.408 "compare": false, 00:22:32.408 "compare_and_write": false, 00:22:32.408 "abort": true, 00:22:32.408 "seek_hole": false, 00:22:32.408 "seek_data": false, 00:22:32.408 "copy": true, 00:22:32.408 "nvme_iov_md": false 00:22:32.408 }, 00:22:32.408 "memory_domains": [ 00:22:32.408 { 00:22:32.408 "dma_device_id": "system", 00:22:32.408 "dma_device_type": 1 00:22:32.408 }, 00:22:32.408 { 00:22:32.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.408 "dma_device_type": 2 00:22:32.408 } 00:22:32.408 ], 00:22:32.408 "driver_specific": {} 00:22:32.408 } 00:22:32.408 ] 
00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.664 "name": "Existed_Raid", 00:22:32.664 "uuid": "ddbc0151-8fee-4834-90d2-f233dbbf22df", 00:22:32.664 "strip_size_kb": 0, 00:22:32.664 "state": "configuring", 00:22:32.664 "raid_level": "raid1", 00:22:32.664 "superblock": true, 00:22:32.664 "num_base_bdevs": 2, 00:22:32.664 "num_base_bdevs_discovered": 1, 00:22:32.664 "num_base_bdevs_operational": 2, 00:22:32.664 "base_bdevs_list": [ 00:22:32.664 { 00:22:32.664 "name": "BaseBdev1", 00:22:32.664 "uuid": "77cfe407-0644-43a8-8ecd-739d5dd258d6", 00:22:32.664 "is_configured": true, 00:22:32.664 "data_offset": 256, 00:22:32.664 "data_size": 7936 00:22:32.664 }, 00:22:32.664 { 00:22:32.664 "name": "BaseBdev2", 00:22:32.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.664 "is_configured": false, 00:22:32.664 "data_offset": 0, 00:22:32.664 "data_size": 0 00:22:32.664 } 00:22:32.664 ] 00:22:32.664 }' 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.664 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:33.225 13:45:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:33.481 [2024-07-15 13:45:20.876055] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:33.481 [2024-07-15 13:45:20.876088] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f2820 name Existed_Raid, state configuring 00:22:33.481 13:45:20 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:33.481 [2024-07-15 13:45:21.048537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:33.481 [2024-07-15 13:45:21.049570] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:33.481 [2024-07-15 13:45:21.049597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.481 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.738 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.738 "name": "Existed_Raid", 00:22:33.738 "uuid": "c8742ea7-5502-460d-bcf7-27caab95b7a3", 00:22:33.738 "strip_size_kb": 0, 00:22:33.738 "state": "configuring", 00:22:33.738 "raid_level": "raid1", 00:22:33.738 "superblock": true, 00:22:33.738 "num_base_bdevs": 2, 00:22:33.738 "num_base_bdevs_discovered": 1, 00:22:33.738 "num_base_bdevs_operational": 2, 00:22:33.738 "base_bdevs_list": [ 00:22:33.738 { 00:22:33.738 "name": "BaseBdev1", 00:22:33.738 "uuid": "77cfe407-0644-43a8-8ecd-739d5dd258d6", 00:22:33.738 "is_configured": true, 00:22:33.738 "data_offset": 256, 00:22:33.738 "data_size": 7936 00:22:33.738 }, 00:22:33.738 { 00:22:33.738 "name": "BaseBdev2", 00:22:33.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.738 "is_configured": false, 00:22:33.738 "data_offset": 0, 00:22:33.738 "data_size": 0 00:22:33.738 } 00:22:33.738 ] 00:22:33.738 }' 00:22:33.738 13:45:21 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.738 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:34.301 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:22:34.558 [2024-07-15 13:45:21.926287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:34.558 [2024-07-15 13:45:21.926401] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f4750 00:22:34.558 [2024-07-15 13:45:21.926410] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:34.558 [2024-07-15 13:45:21.926453] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f2dc0 00:22:34.558 [2024-07-15 13:45:21.926521] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f4750 00:22:34.558 [2024-07-15 13:45:21.926527] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16f4750 00:22:34.558 [2024-07-15 13:45:21.926572] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:34.558 BaseBdev2 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:34.558 13:45:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:34.558 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:34.814 [ 00:22:34.814 { 00:22:34.814 "name": "BaseBdev2", 00:22:34.814 "aliases": [ 00:22:34.814 "1ab4f1fb-7f71-414c-a9d9-07445389badb" 00:22:34.814 ], 00:22:34.814 "product_name": "Malloc disk", 00:22:34.814 "block_size": 4096, 00:22:34.814 "num_blocks": 8192, 00:22:34.814 "uuid": "1ab4f1fb-7f71-414c-a9d9-07445389badb", 00:22:34.814 "md_size": 32, 00:22:34.814 "md_interleave": false, 00:22:34.814 "dif_type": 0, 00:22:34.814 "assigned_rate_limits": { 00:22:34.814 "rw_ios_per_sec": 0, 00:22:34.814 "rw_mbytes_per_sec": 0, 00:22:34.814 "r_mbytes_per_sec": 0, 00:22:34.814 "w_mbytes_per_sec": 0 00:22:34.814 }, 00:22:34.814 "claimed": true, 00:22:34.814 "claim_type": "exclusive_write", 00:22:34.814 "zoned": false, 00:22:34.814 "supported_io_types": { 00:22:34.814 "read": true, 00:22:34.814 "write": true, 00:22:34.814 "unmap": true, 00:22:34.814 "flush": true, 00:22:34.814 "reset": true, 00:22:34.814 "nvme_admin": false, 00:22:34.814 "nvme_io": false, 00:22:34.814 
"nvme_io_md": false, 00:22:34.814 "write_zeroes": true, 00:22:34.814 "zcopy": true, 00:22:34.814 "get_zone_info": false, 00:22:34.814 "zone_management": false, 00:22:34.814 "zone_append": false, 00:22:34.814 "compare": false, 00:22:34.814 "compare_and_write": false, 00:22:34.814 "abort": true, 00:22:34.814 "seek_hole": false, 00:22:34.814 "seek_data": false, 00:22:34.814 "copy": true, 00:22:34.814 "nvme_iov_md": false 00:22:34.814 }, 00:22:34.814 "memory_domains": [ 00:22:34.814 { 00:22:34.814 "dma_device_id": "system", 00:22:34.814 "dma_device_type": 1 00:22:34.814 }, 00:22:34.814 { 00:22:34.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.814 "dma_device_type": 2 00:22:34.814 } 00:22:34.814 ], 00:22:34.814 "driver_specific": {} 00:22:34.814 } 00:22:34.814 ] 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.814 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:35.069 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.069 "name": "Existed_Raid", 00:22:35.069 "uuid": "c8742ea7-5502-460d-bcf7-27caab95b7a3", 00:22:35.069 "strip_size_kb": 0, 00:22:35.069 "state": "online", 00:22:35.069 "raid_level": "raid1", 00:22:35.069 "superblock": true, 00:22:35.069 "num_base_bdevs": 2, 00:22:35.069 "num_base_bdevs_discovered": 2, 00:22:35.069 "num_base_bdevs_operational": 2, 00:22:35.070 "base_bdevs_list": [ 00:22:35.070 { 00:22:35.070 "name": "BaseBdev1", 00:22:35.070 "uuid": "77cfe407-0644-43a8-8ecd-739d5dd258d6", 00:22:35.070 "is_configured": true, 00:22:35.070 "data_offset": 256, 00:22:35.070 "data_size": 7936 00:22:35.070 }, 00:22:35.070 { 
00:22:35.070 "name": "BaseBdev2", 00:22:35.070 "uuid": "1ab4f1fb-7f71-414c-a9d9-07445389badb", 00:22:35.070 "is_configured": true, 00:22:35.070 "data_offset": 256, 00:22:35.070 "data_size": 7936 00:22:35.070 } 00:22:35.070 ] 00:22:35.070 }' 00:22:35.070 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.070 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:35.653 13:45:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:35.654 [2024-07-15 13:45:23.113558] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:35.654 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:35.654 "name": "Existed_Raid", 00:22:35.654 "aliases": [ 00:22:35.654 "c8742ea7-5502-460d-bcf7-27caab95b7a3" 00:22:35.654 ], 00:22:35.654 "product_name": "Raid Volume", 00:22:35.654 "block_size": 4096, 00:22:35.654 "num_blocks": 7936, 00:22:35.654 "uuid": "c8742ea7-5502-460d-bcf7-27caab95b7a3", 00:22:35.654 "md_size": 32, 00:22:35.654 "md_interleave": false, 00:22:35.654 "dif_type": 0, 00:22:35.654 "assigned_rate_limits": { 00:22:35.654 "rw_ios_per_sec": 0, 00:22:35.654 "rw_mbytes_per_sec": 0, 00:22:35.654 "r_mbytes_per_sec": 0, 00:22:35.654 "w_mbytes_per_sec": 0 00:22:35.654 }, 00:22:35.654 "claimed": false, 00:22:35.654 "zoned": false, 00:22:35.654 "supported_io_types": { 00:22:35.654 "read": true, 00:22:35.654 "write": true, 00:22:35.654 "unmap": false, 00:22:35.654 "flush": false, 00:22:35.654 "reset": true, 00:22:35.654 "nvme_admin": false, 00:22:35.654 "nvme_io": false, 00:22:35.654 "nvme_io_md": false, 00:22:35.654 "write_zeroes": true, 00:22:35.654 "zcopy": false, 00:22:35.654 "get_zone_info": false, 00:22:35.654 "zone_management": false, 00:22:35.654 "zone_append": false, 00:22:35.654 "compare": false, 00:22:35.654 "compare_and_write": false, 00:22:35.654 "abort": false, 00:22:35.654 "seek_hole": false, 00:22:35.654 "seek_data": false, 00:22:35.654 "copy": false, 00:22:35.654 "nvme_iov_md": false 00:22:35.654 }, 00:22:35.654 "memory_domains": [ 00:22:35.654 { 00:22:35.654 "dma_device_id": "system", 00:22:35.654 "dma_device_type": 1 00:22:35.654 }, 00:22:35.654 { 00:22:35.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.654 "dma_device_type": 2 00:22:35.654 }, 00:22:35.654 { 00:22:35.654 "dma_device_id": "system", 00:22:35.654 "dma_device_type": 1 00:22:35.654 }, 00:22:35.654 { 00:22:35.654 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.654 "dma_device_type": 2 00:22:35.654 } 00:22:35.654 ], 00:22:35.654 "driver_specific": { 00:22:35.654 "raid": { 00:22:35.654 "uuid": "c8742ea7-5502-460d-bcf7-27caab95b7a3", 00:22:35.654 "strip_size_kb": 0, 00:22:35.654 "state": "online", 00:22:35.654 "raid_level": "raid1", 00:22:35.654 "superblock": true, 00:22:35.654 "num_base_bdevs": 2, 00:22:35.654 "num_base_bdevs_discovered": 2, 00:22:35.654 "num_base_bdevs_operational": 2, 00:22:35.654 "base_bdevs_list": [ 00:22:35.654 { 00:22:35.654 "name": "BaseBdev1", 00:22:35.654 "uuid": "77cfe407-0644-43a8-8ecd-739d5dd258d6", 00:22:35.654 "is_configured": true, 00:22:35.654 "data_offset": 256, 00:22:35.654 "data_size": 7936 00:22:35.654 }, 00:22:35.654 { 00:22:35.654 "name": "BaseBdev2", 00:22:35.654 "uuid": "1ab4f1fb-7f71-414c-a9d9-07445389badb", 00:22:35.654 "is_configured": true, 00:22:35.654 "data_offset": 256, 00:22:35.654 "data_size": 7936 00:22:35.654 } 00:22:35.654 ] 00:22:35.654 } 00:22:35.654 } 00:22:35.654 }' 00:22:35.654 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:35.654 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:35.654 BaseBdev2' 00:22:35.654 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:35.654 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:35.654 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:35.981 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:35.981 "name": "BaseBdev1", 00:22:35.981 "aliases": [ 00:22:35.981 "77cfe407-0644-43a8-8ecd-739d5dd258d6" 00:22:35.981 ], 00:22:35.981 "product_name": "Malloc disk", 00:22:35.981 "block_size": 4096, 00:22:35.981 "num_blocks": 8192, 00:22:35.981 "uuid": "77cfe407-0644-43a8-8ecd-739d5dd258d6", 00:22:35.981 "md_size": 32, 00:22:35.981 "md_interleave": false, 00:22:35.981 "dif_type": 0, 00:22:35.981 "assigned_rate_limits": { 00:22:35.981 "rw_ios_per_sec": 0, 00:22:35.982 "rw_mbytes_per_sec": 0, 00:22:35.982 "r_mbytes_per_sec": 0, 00:22:35.982 "w_mbytes_per_sec": 0 00:22:35.982 }, 00:22:35.982 "claimed": true, 00:22:35.982 "claim_type": "exclusive_write", 00:22:35.982 "zoned": false, 00:22:35.982 "supported_io_types": { 00:22:35.982 "read": true, 00:22:35.982 "write": true, 00:22:35.982 "unmap": true, 00:22:35.982 "flush": true, 00:22:35.982 "reset": true, 00:22:35.982 "nvme_admin": false, 00:22:35.982 "nvme_io": false, 00:22:35.982 "nvme_io_md": false, 00:22:35.982 "write_zeroes": true, 00:22:35.982 "zcopy": true, 00:22:35.982 "get_zone_info": false, 00:22:35.982 "zone_management": false, 00:22:35.982 "zone_append": false, 00:22:35.982 "compare": false, 00:22:35.982 "compare_and_write": false, 00:22:35.982 "abort": true, 00:22:35.982 "seek_hole": false, 00:22:35.982 "seek_data": false, 00:22:35.982 "copy": true, 00:22:35.982 "nvme_iov_md": false 00:22:35.982 }, 00:22:35.982 "memory_domains": [ 00:22:35.982 { 00:22:35.982 "dma_device_id": "system", 00:22:35.982 "dma_device_type": 1 00:22:35.982 }, 00:22:35.982 { 00:22:35.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:22:35.982 "dma_device_type": 2 00:22:35.982 } 00:22:35.982 ], 00:22:35.982 "driver_specific": {} 00:22:35.982 }' 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:35.982 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:36.247 "name": "BaseBdev2", 00:22:36.247 "aliases": [ 00:22:36.247 "1ab4f1fb-7f71-414c-a9d9-07445389badb" 00:22:36.247 ], 00:22:36.247 "product_name": "Malloc disk", 00:22:36.247 "block_size": 4096, 00:22:36.247 "num_blocks": 8192, 00:22:36.247 "uuid": "1ab4f1fb-7f71-414c-a9d9-07445389badb", 00:22:36.247 "md_size": 32, 00:22:36.247 "md_interleave": false, 00:22:36.247 "dif_type": 0, 00:22:36.247 "assigned_rate_limits": { 00:22:36.247 "rw_ios_per_sec": 0, 00:22:36.247 "rw_mbytes_per_sec": 0, 00:22:36.247 "r_mbytes_per_sec": 0, 00:22:36.247 "w_mbytes_per_sec": 0 00:22:36.247 }, 00:22:36.247 "claimed": true, 00:22:36.247 "claim_type": "exclusive_write", 00:22:36.247 "zoned": false, 00:22:36.247 "supported_io_types": { 00:22:36.247 "read": true, 00:22:36.247 "write": true, 00:22:36.247 "unmap": true, 00:22:36.247 "flush": true, 00:22:36.247 "reset": true, 00:22:36.247 "nvme_admin": false, 00:22:36.247 "nvme_io": false, 00:22:36.247 "nvme_io_md": false, 00:22:36.247 "write_zeroes": true, 00:22:36.247 "zcopy": true, 00:22:36.247 "get_zone_info": false, 00:22:36.247 "zone_management": false, 00:22:36.247 "zone_append": false, 00:22:36.247 "compare": false, 00:22:36.247 "compare_and_write": false, 00:22:36.247 "abort": true, 00:22:36.247 "seek_hole": false, 00:22:36.247 "seek_data": false, 00:22:36.247 "copy": true, 00:22:36.247 "nvme_iov_md": false 00:22:36.247 }, 00:22:36.247 "memory_domains": [ 00:22:36.247 { 00:22:36.247 
"dma_device_id": "system", 00:22:36.247 "dma_device_type": 1 00:22:36.247 }, 00:22:36.247 { 00:22:36.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.247 "dma_device_type": 2 00:22:36.247 } 00:22:36.247 ], 00:22:36.247 "driver_specific": {} 00:22:36.247 }' 00:22:36.247 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.534 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.534 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:36.534 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.534 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.534 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:36.534 13:45:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.534 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.534 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:36.534 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.534 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.534 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:36.534 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:36.817 [2024-07-15 13:45:24.292487] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:36.817 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.817 13:45:24 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.818 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.818 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.818 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.818 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.076 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.076 "name": "Existed_Raid", 00:22:37.076 "uuid": "c8742ea7-5502-460d-bcf7-27caab95b7a3", 00:22:37.076 "strip_size_kb": 0, 00:22:37.076 "state": "online", 00:22:37.076 "raid_level": "raid1", 00:22:37.076 "superblock": true, 00:22:37.076 "num_base_bdevs": 2, 00:22:37.076 "num_base_bdevs_discovered": 1, 00:22:37.076 "num_base_bdevs_operational": 1, 00:22:37.076 "base_bdevs_list": [ 00:22:37.076 { 00:22:37.076 "name": null, 00:22:37.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.076 "is_configured": false, 00:22:37.076 "data_offset": 256, 00:22:37.076 "data_size": 7936 00:22:37.076 }, 00:22:37.076 { 00:22:37.076 "name": "BaseBdev2", 00:22:37.076 "uuid": "1ab4f1fb-7f71-414c-a9d9-07445389badb", 00:22:37.076 "is_configured": true, 00:22:37.076 "data_offset": 256, 00:22:37.076 "data_size": 7936 00:22:37.076 } 00:22:37.076 ] 00:22:37.076 }' 00:22:37.076 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.076 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:37.642 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:37.642 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:37.642 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:37.642 13:45:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.642 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:37.642 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:37.642 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:37.900 [2024-07-15 13:45:25.311102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:37.900 [2024-07-15 13:45:25.311166] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:37.900 [2024-07-15 13:45:25.321927] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:37.900 [2024-07-15 13:45:25.321954] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:37.900 [2024-07-15 13:45:25.321961] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f4750 name Existed_Raid, state offline 00:22:37.900 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:37.900 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:37.900 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.900 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 92798 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 92798 ']' 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 92798 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 92798 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 92798' 00:22:38.157 killing process with pid 92798 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 92798 00:22:38.157 [2024-07-15 13:45:25.569992] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 92798 00:22:38.157 [2024-07-15 13:45:25.570897] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:22:38.157 00:22:38.157 real 0m8.357s 00:22:38.157 user 0m14.659s 00:22:38.157 sys 0m1.688s 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:38.157 13:45:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:38.157 ************************************ 00:22:38.157 END TEST raid_state_function_test_sb_md_separate 00:22:38.157 ************************************ 00:22:38.415 13:45:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:38.415 13:45:25 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:22:38.416 
13:45:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:38.416 13:45:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:38.416 13:45:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:38.416 ************************************ 00:22:38.416 START TEST raid_superblock_test_md_separate 00:22:38.416 ************************************ 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=94120 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 94120 /var/tmp/spdk-raid.sock 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 94120 ']' 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:38.416 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:38.416 13:45:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:38.416 [2024-07-15 13:45:25.921430] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:22:38.416 [2024-07-15 13:45:25.921487] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid94120 ] 00:22:38.416 [2024-07-15 13:45:26.008304] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:38.673 [2024-07-15 13:45:26.096930] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:38.673 [2024-07-15 13:45:26.150661] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:38.673 [2024-07-15 13:45:26.150694] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:39.239 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:22:39.496 malloc1 00:22:39.496 13:45:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:39.496 [2024-07-15 13:45:27.071498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:39.496 [2024-07-15 13:45:27.071540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:39.496 [2024-07-15 13:45:27.071554] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1399c00 00:22:39.496 [2024-07-15 13:45:27.071563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:39.496 [2024-07-15 13:45:27.072565] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:39.496 [2024-07-15 13:45:27.072586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:39.496 pt1 
00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:39.496 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:22:39.753 malloc2 00:22:39.753 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:40.012 [2024-07-15 13:45:27.428851] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:40.012 [2024-07-15 13:45:27.428889] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.012 [2024-07-15 13:45:27.428917] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1526330 00:22:40.012 [2024-07-15 13:45:27.428925] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.012 [2024-07-15 13:45:27.429956] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.012 [2024-07-15 13:45:27.429979] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:40.012 pt2 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:22:40.012 [2024-07-15 13:45:27.601315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:40.012 [2024-07-15 13:45:27.602140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:40.012 [2024-07-15 13:45:27.602247] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x151b860 00:22:40.012 [2024-07-15 13:45:27.602256] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:40.012 [2024-07-15 13:45:27.602303] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x151c8e0 00:22:40.012 [2024-07-15 13:45:27.602382] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151b860 00:22:40.012 [2024-07-15 13:45:27.602388] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with 
name raid_bdev1, raid_bdev 0x151b860 00:22:40.012 [2024-07-15 13:45:27.602434] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.012 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.269 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.269 "name": "raid_bdev1", 00:22:40.269 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:40.269 "strip_size_kb": 0, 00:22:40.269 "state": "online", 00:22:40.269 "raid_level": "raid1", 00:22:40.269 "superblock": true, 00:22:40.269 "num_base_bdevs": 2, 00:22:40.269 "num_base_bdevs_discovered": 2, 00:22:40.269 "num_base_bdevs_operational": 2, 00:22:40.269 "base_bdevs_list": [ 00:22:40.269 { 00:22:40.269 "name": "pt1", 00:22:40.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:40.269 "is_configured": true, 00:22:40.269 "data_offset": 256, 00:22:40.269 "data_size": 7936 00:22:40.269 }, 00:22:40.269 { 00:22:40.269 "name": "pt2", 00:22:40.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:40.269 "is_configured": true, 00:22:40.269 "data_offset": 256, 00:22:40.269 "data_size": 7936 00:22:40.269 } 00:22:40.269 ] 00:22:40.269 }' 00:22:40.269 13:45:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.270 13:45:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:40.834 13:45:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:40.834 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:40.834 [2024-07-15 13:45:28.443678] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:41.092 "name": "raid_bdev1", 00:22:41.092 "aliases": [ 00:22:41.092 "114cc4a9-8f9a-4116-834f-b5375c7b686e" 00:22:41.092 ], 00:22:41.092 "product_name": "Raid Volume", 00:22:41.092 "block_size": 4096, 00:22:41.092 "num_blocks": 7936, 00:22:41.092 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:41.092 "md_size": 32, 00:22:41.092 "md_interleave": false, 00:22:41.092 "dif_type": 0, 00:22:41.092 "assigned_rate_limits": { 00:22:41.092 "rw_ios_per_sec": 0, 00:22:41.092 "rw_mbytes_per_sec": 0, 00:22:41.092 "r_mbytes_per_sec": 0, 00:22:41.092 "w_mbytes_per_sec": 0 00:22:41.092 }, 00:22:41.092 "claimed": false, 00:22:41.092 "zoned": false, 00:22:41.092 "supported_io_types": { 00:22:41.092 "read": true, 00:22:41.092 "write": true, 00:22:41.092 "unmap": false, 00:22:41.092 "flush": false, 00:22:41.092 "reset": true, 00:22:41.092 "nvme_admin": false, 00:22:41.092 "nvme_io": false, 00:22:41.092 "nvme_io_md": false, 00:22:41.092 "write_zeroes": true, 00:22:41.092 "zcopy": false, 00:22:41.092 "get_zone_info": false, 00:22:41.092 "zone_management": false, 00:22:41.092 "zone_append": false, 00:22:41.092 "compare": false, 00:22:41.092 "compare_and_write": false, 00:22:41.092 "abort": false, 00:22:41.092 "seek_hole": false, 00:22:41.092 "seek_data": false, 00:22:41.092 "copy": false, 00:22:41.092 "nvme_iov_md": false 00:22:41.092 }, 00:22:41.092 "memory_domains": [ 00:22:41.092 { 00:22:41.092 "dma_device_id": "system", 00:22:41.092 "dma_device_type": 1 00:22:41.092 }, 00:22:41.092 { 00:22:41.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.092 "dma_device_type": 2 00:22:41.092 }, 00:22:41.092 { 00:22:41.092 "dma_device_id": "system", 00:22:41.092 "dma_device_type": 1 00:22:41.092 }, 00:22:41.092 { 00:22:41.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.092 "dma_device_type": 2 00:22:41.092 } 00:22:41.092 ], 00:22:41.092 "driver_specific": { 00:22:41.092 "raid": { 00:22:41.092 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:41.092 "strip_size_kb": 0, 00:22:41.092 "state": "online", 00:22:41.092 "raid_level": "raid1", 00:22:41.092 "superblock": true, 00:22:41.092 "num_base_bdevs": 2, 00:22:41.092 "num_base_bdevs_discovered": 2, 00:22:41.092 "num_base_bdevs_operational": 2, 00:22:41.092 "base_bdevs_list": [ 00:22:41.092 { 00:22:41.092 "name": "pt1", 00:22:41.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:41.092 "is_configured": true, 00:22:41.092 "data_offset": 256, 00:22:41.092 "data_size": 7936 00:22:41.092 }, 00:22:41.092 { 00:22:41.092 "name": "pt2", 00:22:41.092 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:41.092 "is_configured": true, 00:22:41.092 "data_offset": 256, 00:22:41.092 "data_size": 7936 00:22:41.092 } 00:22:41.092 ] 00:22:41.092 } 00:22:41.092 } 00:22:41.092 }' 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:41.092 pt2' 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.092 "name": "pt1", 00:22:41.092 "aliases": [ 00:22:41.092 "00000000-0000-0000-0000-000000000001" 00:22:41.092 ], 00:22:41.092 "product_name": "passthru", 00:22:41.092 "block_size": 4096, 00:22:41.092 "num_blocks": 8192, 00:22:41.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:41.092 "md_size": 32, 00:22:41.092 "md_interleave": false, 00:22:41.092 "dif_type": 0, 00:22:41.092 "assigned_rate_limits": { 00:22:41.092 "rw_ios_per_sec": 0, 00:22:41.092 "rw_mbytes_per_sec": 0, 00:22:41.092 "r_mbytes_per_sec": 0, 00:22:41.092 "w_mbytes_per_sec": 0 00:22:41.092 }, 00:22:41.092 "claimed": true, 00:22:41.092 "claim_type": "exclusive_write", 00:22:41.092 "zoned": false, 00:22:41.092 "supported_io_types": { 00:22:41.092 "read": true, 00:22:41.092 "write": true, 00:22:41.092 "unmap": true, 00:22:41.092 "flush": true, 00:22:41.092 "reset": true, 00:22:41.092 "nvme_admin": false, 00:22:41.092 "nvme_io": false, 00:22:41.092 "nvme_io_md": false, 00:22:41.092 "write_zeroes": true, 00:22:41.092 "zcopy": true, 00:22:41.092 "get_zone_info": false, 00:22:41.092 "zone_management": false, 00:22:41.092 "zone_append": false, 00:22:41.092 "compare": false, 00:22:41.092 "compare_and_write": false, 00:22:41.092 "abort": true, 00:22:41.092 "seek_hole": false, 00:22:41.092 "seek_data": false, 00:22:41.092 "copy": true, 00:22:41.092 "nvme_iov_md": false 00:22:41.092 }, 00:22:41.092 "memory_domains": [ 00:22:41.092 { 00:22:41.092 "dma_device_id": "system", 00:22:41.092 "dma_device_type": 1 00:22:41.092 }, 00:22:41.092 { 00:22:41.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.092 "dma_device_type": 2 00:22:41.092 } 00:22:41.092 ], 00:22:41.092 "driver_specific": { 00:22:41.092 "passthru": { 00:22:41.092 "name": "pt1", 00:22:41.092 "base_bdev_name": "malloc1" 00:22:41.092 } 00:22:41.092 } 00:22:41.092 }' 00:22:41.092 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 
00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.350 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.607 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:41.607 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.607 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:41.607 13:45:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.607 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.607 "name": "pt2", 00:22:41.607 "aliases": [ 00:22:41.607 "00000000-0000-0000-0000-000000000002" 00:22:41.607 ], 00:22:41.607 "product_name": "passthru", 00:22:41.607 "block_size": 4096, 00:22:41.607 "num_blocks": 8192, 00:22:41.607 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:41.607 "md_size": 32, 00:22:41.607 "md_interleave": false, 00:22:41.607 "dif_type": 0, 00:22:41.607 "assigned_rate_limits": { 00:22:41.607 "rw_ios_per_sec": 0, 00:22:41.607 "rw_mbytes_per_sec": 0, 00:22:41.607 "r_mbytes_per_sec": 0, 00:22:41.607 "w_mbytes_per_sec": 0 00:22:41.607 }, 00:22:41.607 "claimed": true, 00:22:41.607 "claim_type": "exclusive_write", 00:22:41.607 "zoned": false, 00:22:41.607 "supported_io_types": { 00:22:41.607 "read": true, 00:22:41.607 "write": true, 00:22:41.607 "unmap": true, 00:22:41.607 "flush": true, 00:22:41.607 "reset": true, 00:22:41.607 "nvme_admin": false, 00:22:41.607 "nvme_io": false, 00:22:41.607 "nvme_io_md": false, 00:22:41.607 "write_zeroes": true, 00:22:41.607 "zcopy": true, 00:22:41.607 "get_zone_info": false, 00:22:41.607 "zone_management": false, 00:22:41.607 "zone_append": false, 00:22:41.607 "compare": false, 00:22:41.607 "compare_and_write": false, 00:22:41.607 "abort": true, 00:22:41.607 "seek_hole": false, 00:22:41.607 "seek_data": false, 00:22:41.607 "copy": true, 00:22:41.607 "nvme_iov_md": false 00:22:41.607 }, 00:22:41.607 "memory_domains": [ 00:22:41.607 { 00:22:41.607 "dma_device_id": "system", 00:22:41.607 "dma_device_type": 1 00:22:41.607 }, 00:22:41.607 { 00:22:41.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.607 "dma_device_type": 2 00:22:41.607 } 00:22:41.607 ], 00:22:41.607 "driver_specific": { 00:22:41.607 "passthru": { 00:22:41.607 "name": "pt2", 00:22:41.607 "base_bdev_name": "malloc2" 00:22:41.607 } 00:22:41.607 } 00:22:41.607 }' 00:22:41.607 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.607 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:41.865 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:42.122 [2024-07-15 13:45:29.602615] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:42.122 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=114cc4a9-8f9a-4116-834f-b5375c7b686e 00:22:42.122 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 114cc4a9-8f9a-4116-834f-b5375c7b686e ']' 00:22:42.122 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:42.380 [2024-07-15 13:45:29.766874] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:42.380 [2024-07-15 13:45:29.766891] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:42.380 [2024-07-15 13:45:29.766934] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:42.380 [2024-07-15 13:45:29.766973] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:42.380 [2024-07-15 13:45:29.766981] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151b860 name raid_bdev1, state offline 00:22:42.380 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.380 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:42.380 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:42.380 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:42.380 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:42.380 13:45:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:42.638 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:42.638 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:42.895 13:45:30 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:42.895 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:43.153 [2024-07-15 13:45:30.673191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:43.153 [2024-07-15 13:45:30.674197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:43.153 [2024-07-15 13:45:30.674244] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:43.153 [2024-07-15 13:45:30.674275] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:43.153 [2024-07-15 13:45:30.674288] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:43.153 [2024-07-15 13:45:30.674295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151bae0 name raid_bdev1, state configuring 00:22:43.153 request: 00:22:43.153 { 00:22:43.153 "name": "raid_bdev1", 00:22:43.153 "raid_level": "raid1", 00:22:43.153 "base_bdevs": [ 00:22:43.153 "malloc1", 00:22:43.153 "malloc2" 00:22:43.153 ], 00:22:43.153 "superblock": false, 00:22:43.153 "method": "bdev_raid_create", 00:22:43.153 "req_id": 1 
00:22:43.153 } 00:22:43.153 Got JSON-RPC error response 00:22:43.153 response: 00:22:43.153 { 00:22:43.153 "code": -17, 00:22:43.153 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:43.153 } 00:22:43.153 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:22:43.153 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:43.153 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:43.153 13:45:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:43.153 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.153 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:43.412 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:43.412 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:43.412 13:45:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:43.412 [2024-07-15 13:45:31.022079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:43.412 [2024-07-15 13:45:31.022121] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:43.412 [2024-07-15 13:45:31.022141] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1399e30 00:22:43.412 [2024-07-15 13:45:31.022149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:43.412 [2024-07-15 13:45:31.023342] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:43.412 [2024-07-15 13:45:31.023366] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:43.412 [2024-07-15 13:45:31.023407] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:43.412 [2024-07-15 13:45:31.023433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:43.412 pt1 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:43.669 "name": "raid_bdev1", 00:22:43.669 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:43.669 "strip_size_kb": 0, 00:22:43.669 "state": "configuring", 00:22:43.669 "raid_level": "raid1", 00:22:43.669 "superblock": true, 00:22:43.669 "num_base_bdevs": 2, 00:22:43.669 "num_base_bdevs_discovered": 1, 00:22:43.669 "num_base_bdevs_operational": 2, 00:22:43.669 "base_bdevs_list": [ 00:22:43.669 { 00:22:43.669 "name": "pt1", 00:22:43.669 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:43.669 "is_configured": true, 00:22:43.669 "data_offset": 256, 00:22:43.669 "data_size": 7936 00:22:43.669 }, 00:22:43.669 { 00:22:43.669 "name": null, 00:22:43.669 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:43.669 "is_configured": false, 00:22:43.669 "data_offset": 256, 00:22:43.669 "data_size": 7936 00:22:43.669 } 00:22:43.669 ] 00:22:43.669 }' 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:43.669 13:45:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:44.234 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:44.234 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:44.234 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:44.234 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:44.234 [2024-07-15 13:45:31.848211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:44.234 [2024-07-15 13:45:31.848259] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:44.234 [2024-07-15 13:45:31.848275] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151d2f0 00:22:44.234 [2024-07-15 13:45:31.848283] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:44.234 [2024-07-15 13:45:31.848436] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:44.234 [2024-07-15 13:45:31.848452] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:44.234 [2024-07-15 13:45:31.848484] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:44.234 [2024-07-15 13:45:31.848498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:44.234 [2024-07-15 13:45:31.848568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13986c0 00:22:44.234 [2024-07-15 13:45:31.848575] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:44.234 [2024-07-15 13:45:31.848616] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x151e8e0 00:22:44.234 [2024-07-15 13:45:31.848685] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13986c0 00:22:44.234 [2024-07-15 13:45:31.848691] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13986c0 00:22:44.234 [2024-07-15 13:45:31.848741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:44.493 pt2 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.493 13:45:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.493 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.493 "name": "raid_bdev1", 00:22:44.493 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:44.493 "strip_size_kb": 0, 00:22:44.493 "state": "online", 00:22:44.493 "raid_level": "raid1", 00:22:44.493 "superblock": true, 00:22:44.493 "num_base_bdevs": 2, 00:22:44.493 "num_base_bdevs_discovered": 2, 00:22:44.493 "num_base_bdevs_operational": 2, 00:22:44.493 "base_bdevs_list": [ 00:22:44.493 { 00:22:44.493 "name": "pt1", 00:22:44.493 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:44.493 "is_configured": true, 00:22:44.493 "data_offset": 256, 00:22:44.493 "data_size": 7936 00:22:44.493 }, 00:22:44.493 { 00:22:44.493 "name": "pt2", 00:22:44.493 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:44.493 "is_configured": true, 00:22:44.493 "data_offset": 256, 00:22:44.493 "data_size": 7936 00:22:44.493 } 00:22:44.493 ] 00:22:44.493 }' 00:22:44.493 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.493 13:45:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:45.060 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:45.318 [2024-07-15 13:45:32.710623] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:45.318 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:45.318 "name": "raid_bdev1", 00:22:45.318 "aliases": [ 00:22:45.318 "114cc4a9-8f9a-4116-834f-b5375c7b686e" 00:22:45.318 ], 00:22:45.318 "product_name": "Raid Volume", 00:22:45.318 "block_size": 4096, 00:22:45.318 "num_blocks": 7936, 00:22:45.318 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:45.318 "md_size": 32, 00:22:45.318 "md_interleave": false, 00:22:45.318 "dif_type": 0, 00:22:45.318 "assigned_rate_limits": { 00:22:45.318 "rw_ios_per_sec": 0, 00:22:45.318 "rw_mbytes_per_sec": 0, 00:22:45.318 "r_mbytes_per_sec": 0, 00:22:45.318 "w_mbytes_per_sec": 0 00:22:45.318 }, 00:22:45.318 "claimed": false, 00:22:45.318 "zoned": false, 00:22:45.318 "supported_io_types": { 00:22:45.318 "read": true, 00:22:45.318 "write": true, 00:22:45.318 "unmap": false, 00:22:45.318 "flush": false, 00:22:45.318 "reset": true, 00:22:45.318 "nvme_admin": false, 00:22:45.318 "nvme_io": false, 00:22:45.318 "nvme_io_md": false, 00:22:45.318 "write_zeroes": true, 00:22:45.318 "zcopy": false, 00:22:45.318 "get_zone_info": false, 00:22:45.318 "zone_management": false, 00:22:45.318 "zone_append": false, 00:22:45.318 "compare": false, 00:22:45.318 "compare_and_write": false, 00:22:45.318 "abort": false, 00:22:45.318 "seek_hole": false, 00:22:45.318 "seek_data": false, 00:22:45.318 "copy": false, 00:22:45.318 "nvme_iov_md": false 00:22:45.318 }, 00:22:45.318 "memory_domains": [ 00:22:45.318 { 00:22:45.318 "dma_device_id": "system", 00:22:45.318 "dma_device_type": 1 00:22:45.318 }, 00:22:45.318 { 00:22:45.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.318 "dma_device_type": 2 00:22:45.318 }, 00:22:45.318 { 00:22:45.318 "dma_device_id": "system", 00:22:45.318 "dma_device_type": 1 00:22:45.318 }, 00:22:45.318 { 00:22:45.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.318 "dma_device_type": 2 00:22:45.318 } 00:22:45.318 ], 00:22:45.318 "driver_specific": { 00:22:45.318 "raid": { 00:22:45.318 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:45.318 "strip_size_kb": 0, 00:22:45.318 "state": "online", 00:22:45.318 "raid_level": "raid1", 00:22:45.318 "superblock": true, 00:22:45.318 "num_base_bdevs": 2, 00:22:45.318 "num_base_bdevs_discovered": 2, 00:22:45.318 "num_base_bdevs_operational": 2, 00:22:45.318 "base_bdevs_list": [ 00:22:45.318 { 00:22:45.318 "name": "pt1", 00:22:45.318 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:45.318 
"is_configured": true, 00:22:45.318 "data_offset": 256, 00:22:45.318 "data_size": 7936 00:22:45.318 }, 00:22:45.318 { 00:22:45.318 "name": "pt2", 00:22:45.318 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:45.318 "is_configured": true, 00:22:45.318 "data_offset": 256, 00:22:45.318 "data_size": 7936 00:22:45.318 } 00:22:45.318 ] 00:22:45.318 } 00:22:45.318 } 00:22:45.318 }' 00:22:45.318 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:45.318 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:45.318 pt2' 00:22:45.318 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.318 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:45.318 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.576 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.576 "name": "pt1", 00:22:45.576 "aliases": [ 00:22:45.576 "00000000-0000-0000-0000-000000000001" 00:22:45.576 ], 00:22:45.576 "product_name": "passthru", 00:22:45.576 "block_size": 4096, 00:22:45.576 "num_blocks": 8192, 00:22:45.576 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:45.576 "md_size": 32, 00:22:45.576 "md_interleave": false, 00:22:45.576 "dif_type": 0, 00:22:45.576 "assigned_rate_limits": { 00:22:45.576 "rw_ios_per_sec": 0, 00:22:45.576 "rw_mbytes_per_sec": 0, 00:22:45.576 "r_mbytes_per_sec": 0, 00:22:45.576 "w_mbytes_per_sec": 0 00:22:45.576 }, 00:22:45.576 "claimed": true, 00:22:45.576 "claim_type": "exclusive_write", 00:22:45.576 "zoned": false, 00:22:45.576 "supported_io_types": { 00:22:45.576 "read": true, 00:22:45.576 "write": true, 00:22:45.576 "unmap": true, 00:22:45.576 "flush": true, 00:22:45.576 "reset": true, 00:22:45.576 "nvme_admin": false, 00:22:45.576 "nvme_io": false, 00:22:45.576 "nvme_io_md": false, 00:22:45.576 "write_zeroes": true, 00:22:45.576 "zcopy": true, 00:22:45.576 "get_zone_info": false, 00:22:45.576 "zone_management": false, 00:22:45.576 "zone_append": false, 00:22:45.576 "compare": false, 00:22:45.576 "compare_and_write": false, 00:22:45.576 "abort": true, 00:22:45.576 "seek_hole": false, 00:22:45.576 "seek_data": false, 00:22:45.576 "copy": true, 00:22:45.576 "nvme_iov_md": false 00:22:45.576 }, 00:22:45.576 "memory_domains": [ 00:22:45.576 { 00:22:45.576 "dma_device_id": "system", 00:22:45.576 "dma_device_type": 1 00:22:45.576 }, 00:22:45.576 { 00:22:45.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.576 "dma_device_type": 2 00:22:45.576 } 00:22:45.576 ], 00:22:45.576 "driver_specific": { 00:22:45.576 "passthru": { 00:22:45.576 "name": "pt1", 00:22:45.576 "base_bdev_name": "malloc1" 00:22:45.576 } 00:22:45.576 } 00:22:45.576 }' 00:22:45.576 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.576 13:45:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.576 13:45:33 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:45.576 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.840 "name": "pt2", 00:22:45.840 "aliases": [ 00:22:45.840 "00000000-0000-0000-0000-000000000002" 00:22:45.840 ], 00:22:45.840 "product_name": "passthru", 00:22:45.840 "block_size": 4096, 00:22:45.840 "num_blocks": 8192, 00:22:45.840 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:45.840 "md_size": 32, 00:22:45.840 "md_interleave": false, 00:22:45.840 "dif_type": 0, 00:22:45.840 "assigned_rate_limits": { 00:22:45.840 "rw_ios_per_sec": 0, 00:22:45.840 "rw_mbytes_per_sec": 0, 00:22:45.840 "r_mbytes_per_sec": 0, 00:22:45.840 "w_mbytes_per_sec": 0 00:22:45.840 }, 00:22:45.840 "claimed": true, 00:22:45.840 "claim_type": "exclusive_write", 00:22:45.840 "zoned": false, 00:22:45.840 "supported_io_types": { 00:22:45.840 "read": true, 00:22:45.840 "write": true, 00:22:45.840 "unmap": true, 00:22:45.840 "flush": true, 00:22:45.840 "reset": true, 00:22:45.840 "nvme_admin": false, 00:22:45.840 "nvme_io": false, 00:22:45.840 "nvme_io_md": false, 00:22:45.840 "write_zeroes": true, 00:22:45.840 "zcopy": true, 00:22:45.840 "get_zone_info": false, 00:22:45.840 "zone_management": false, 00:22:45.840 "zone_append": false, 00:22:45.840 "compare": false, 00:22:45.840 "compare_and_write": false, 00:22:45.840 "abort": true, 00:22:45.840 "seek_hole": false, 00:22:45.840 "seek_data": false, 00:22:45.840 "copy": true, 00:22:45.840 "nvme_iov_md": false 00:22:45.840 }, 00:22:45.840 "memory_domains": [ 00:22:45.840 { 00:22:45.840 "dma_device_id": "system", 00:22:45.840 "dma_device_type": 1 00:22:45.840 }, 00:22:45.840 { 00:22:45.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.840 "dma_device_type": 2 00:22:45.840 } 00:22:45.840 ], 00:22:45.840 "driver_specific": { 00:22:45.840 "passthru": { 00:22:45.840 "name": "pt2", 00:22:45.840 "base_bdev_name": "malloc2" 00:22:45.840 } 00:22:45.840 } 00:22:45.840 }' 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:22:45.840 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:46.097 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:46.355 [2024-07-15 13:45:33.817470] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:46.355 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 114cc4a9-8f9a-4116-834f-b5375c7b686e '!=' 114cc4a9-8f9a-4116-834f-b5375c7b686e ']' 00:22:46.355 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:46.355 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:46.355 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:22:46.355 13:45:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:46.613 [2024-07-15 13:45:33.997783] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:46.613 13:45:34 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.613 "name": "raid_bdev1", 00:22:46.613 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:46.613 "strip_size_kb": 0, 00:22:46.613 "state": "online", 00:22:46.613 "raid_level": "raid1", 00:22:46.613 "superblock": true, 00:22:46.613 "num_base_bdevs": 2, 00:22:46.613 "num_base_bdevs_discovered": 1, 00:22:46.613 "num_base_bdevs_operational": 1, 00:22:46.613 "base_bdevs_list": [ 00:22:46.613 { 00:22:46.613 "name": null, 00:22:46.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.613 "is_configured": false, 00:22:46.613 "data_offset": 256, 00:22:46.613 "data_size": 7936 00:22:46.613 }, 00:22:46.613 { 00:22:46.613 "name": "pt2", 00:22:46.613 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:46.613 "is_configured": true, 00:22:46.613 "data_offset": 256, 00:22:46.613 "data_size": 7936 00:22:46.613 } 00:22:46.613 ] 00:22:46.613 }' 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.613 13:45:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:47.180 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:47.437 [2024-07-15 13:45:34.855962] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:47.437 [2024-07-15 13:45:34.855987] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:47.437 [2024-07-15 13:45:34.856050] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:47.437 [2024-07-15 13:45:34.856083] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:47.437 [2024-07-15 13:45:34.856091] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13986c0 name raid_bdev1, state offline 00:22:47.437 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.438 13:45:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:47.438 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:47.438 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:47.438 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:47.438 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:22:47.695 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:47.953 [2024-07-15 13:45:35.389324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:47.953 [2024-07-15 13:45:35.389363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.953 [2024-07-15 13:45:35.389377] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1397ba0 00:22:47.953 [2024-07-15 13:45:35.389385] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.953 [2024-07-15 13:45:35.390473] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.953 [2024-07-15 13:45:35.390495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:47.953 [2024-07-15 13:45:35.390535] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:47.953 [2024-07-15 13:45:35.390554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:47.953 [2024-07-15 13:45:35.390615] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x151dd30 00:22:47.953 [2024-07-15 13:45:35.390621] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:47.953 [2024-07-15 13:45:35.390666] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x151cea0 00:22:47.953 [2024-07-15 13:45:35.390735] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151dd30 00:22:47.953 [2024-07-15 13:45:35.390741] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151dd30 00:22:47.953 [2024-07-15 13:45:35.390788] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:47.953 pt2 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.953 13:45:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.953 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.211 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.211 "name": "raid_bdev1", 00:22:48.211 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:48.211 "strip_size_kb": 0, 00:22:48.211 "state": "online", 00:22:48.211 "raid_level": "raid1", 00:22:48.211 "superblock": true, 00:22:48.211 "num_base_bdevs": 2, 00:22:48.211 "num_base_bdevs_discovered": 1, 00:22:48.211 "num_base_bdevs_operational": 1, 00:22:48.211 "base_bdevs_list": [ 00:22:48.211 { 00:22:48.211 "name": null, 00:22:48.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.211 "is_configured": false, 00:22:48.211 "data_offset": 256, 00:22:48.211 "data_size": 7936 00:22:48.211 }, 00:22:48.211 { 00:22:48.211 "name": "pt2", 00:22:48.211 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:48.211 "is_configured": true, 00:22:48.211 "data_offset": 256, 00:22:48.211 "data_size": 7936 00:22:48.211 } 00:22:48.211 ] 00:22:48.211 }' 00:22:48.211 13:45:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.211 13:45:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:48.468 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:48.724 [2024-07-15 13:45:36.199414] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:48.724 [2024-07-15 13:45:36.199440] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:48.724 [2024-07-15 13:45:36.199486] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:48.724 [2024-07-15 13:45:36.199521] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:48.724 [2024-07-15 13:45:36.199529] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151dd30 name raid_bdev1, state offline 00:22:48.724 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.724 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:48.980 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:48.980 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:48.980 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:22:48.980 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:48.980 [2024-07-15 13:45:36.568350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:48.980 [2024-07-15 
13:45:36.568390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.980 [2024-07-15 13:45:36.568425] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x139a300 00:22:48.980 [2024-07-15 13:45:36.568433] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.980 [2024-07-15 13:45:36.569518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.981 [2024-07-15 13:45:36.569540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:48.981 [2024-07-15 13:45:36.569576] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:48.981 [2024-07-15 13:45:36.569597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:48.981 [2024-07-15 13:45:36.569665] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:48.981 [2024-07-15 13:45:36.569674] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:48.981 [2024-07-15 13:45:36.569685] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151ea70 name raid_bdev1, state configuring 00:22:48.981 [2024-07-15 13:45:36.569702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:48.981 [2024-07-15 13:45:36.569742] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x151ea70 00:22:48.981 [2024-07-15 13:45:36.569749] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:48.981 [2024-07-15 13:45:36.569790] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x151f930 00:22:48.981 [2024-07-15 13:45:36.569859] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151ea70 00:22:48.981 [2024-07-15 13:45:36.569865] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151ea70 00:22:48.981 [2024-07-15 13:45:36.569913] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:48.981 pt1 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.981 13:45:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.981 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.237 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.237 "name": "raid_bdev1", 00:22:49.237 "uuid": "114cc4a9-8f9a-4116-834f-b5375c7b686e", 00:22:49.237 "strip_size_kb": 0, 00:22:49.237 "state": "online", 00:22:49.237 "raid_level": "raid1", 00:22:49.237 "superblock": true, 00:22:49.237 "num_base_bdevs": 2, 00:22:49.237 "num_base_bdevs_discovered": 1, 00:22:49.237 "num_base_bdevs_operational": 1, 00:22:49.237 "base_bdevs_list": [ 00:22:49.237 { 00:22:49.237 "name": null, 00:22:49.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.237 "is_configured": false, 00:22:49.237 "data_offset": 256, 00:22:49.237 "data_size": 7936 00:22:49.237 }, 00:22:49.237 { 00:22:49.237 "name": "pt2", 00:22:49.237 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:49.237 "is_configured": true, 00:22:49.237 "data_offset": 256, 00:22:49.237 "data_size": 7936 00:22:49.237 } 00:22:49.237 ] 00:22:49.237 }' 00:22:49.237 13:45:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.237 13:45:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:49.800 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:49.800 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:50.057 [2024-07-15 13:45:37.591151] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 114cc4a9-8f9a-4116-834f-b5375c7b686e '!=' 114cc4a9-8f9a-4116-834f-b5375c7b686e ']' 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 94120 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 94120 ']' 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 94120 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 94120 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 94120' 00:22:50.057 killing process with pid 94120 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 94120 00:22:50.057 [2024-07-15 13:45:37.664543] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:50.057 [2024-07-15 13:45:37.664588] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.057 [2024-07-15 13:45:37.664622] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.057 [2024-07-15 13:45:37.664629] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151ea70 name raid_bdev1, state offline 00:22:50.057 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 94120 00:22:50.314 [2024-07-15 13:45:37.686324] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:50.314 13:45:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:22:50.314 00:22:50.314 real 0m12.006s 00:22:50.314 user 0m21.512s 00:22:50.314 sys 0m2.464s 00:22:50.314 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:50.314 13:45:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:50.314 ************************************ 00:22:50.314 END TEST raid_superblock_test_md_separate 00:22:50.314 ************************************ 00:22:50.314 13:45:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:50.314 13:45:37 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:22:50.314 13:45:37 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:22:50.314 13:45:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:50.314 13:45:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:50.314 13:45:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:50.571 ************************************ 00:22:50.571 START TEST raid_rebuild_test_sb_md_separate 00:22:50.571 ************************************ 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 
-- # (( i++ )) 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=95996 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 95996 /var/tmp/spdk-raid.sock 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 95996 ']' 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:50.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:50.571 13:45:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:50.571 [2024-07-15 13:45:38.016680] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
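As a reading aid before the xtrace resumes: the trace that follows brings up bdevperf on /var/tmp/spdk-raid.sock and then assembles the two raid1 members and the rebuild target out of malloc, passthru and delay bdevs before creating raid_bdev1. The lines below are a condensed sketch of that RPC sequence, reconstructed from the xtrace output itself; the $rpc shorthand is introduced here only for brevity and is not a variable used by bdev_raid.sh.
# Condensed sketch of the setup RPCs traced below (reconstructed from the xtrace, not an extra command run by the job).
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Two 32 MB malloc bdevs (4096-byte blocks, separate metadata via -m 32), each wrapped in a passthru bdev,
# become the raid1 members BaseBdev1 and BaseBdev2.
$rpc bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc
$rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$rpc bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc
$rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
# The later rebuild target "spare" is a passthru on top of a delay bdev layered over spare_malloc.
$rpc bdev_malloc_create 32 4096 -m 32 -b spare_malloc
$rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$rpc bdev_passthru_create -b spare_delay -p spare
# Assemble the array with a superblock (-s), raid1 level, and the two base bdevs.
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1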
00:22:50.571 [2024-07-15 13:45:38.016735] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95996 ] 00:22:50.571 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:50.571 Zero copy mechanism will not be used. 00:22:50.571 [2024-07-15 13:45:38.102322] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:50.828 [2024-07-15 13:45:38.192485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:50.828 [2024-07-15 13:45:38.250324] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:50.828 [2024-07-15 13:45:38.250350] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.393 13:45:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:51.393 13:45:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:51.393 13:45:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:51.393 13:45:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:22:51.393 BaseBdev1_malloc 00:22:51.393 13:45:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:51.650 [2024-07-15 13:45:39.160747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:51.650 [2024-07-15 13:45:39.160786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:51.650 [2024-07-15 13:45:39.160805] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fbf90 00:22:51.650 [2024-07-15 13:45:39.160815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:51.650 [2024-07-15 13:45:39.161933] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:51.650 [2024-07-15 13:45:39.161956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:51.650 BaseBdev1 00:22:51.650 13:45:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:51.650 13:45:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:22:51.907 BaseBdev2_malloc 00:22:51.907 13:45:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:51.907 [2024-07-15 13:45:39.510016] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:51.907 [2024-07-15 13:45:39.510054] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:51.907 [2024-07-15 13:45:39.510086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a53ab0 00:22:51.907 [2024-07-15 13:45:39.510095] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:51.907 [2024-07-15 13:45:39.511125] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:51.907 [2024-07-15 13:45:39.511146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:51.907 BaseBdev2 00:22:51.907 13:45:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:22:52.165 spare_malloc 00:22:52.165 13:45:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:52.422 spare_delay 00:22:52.422 13:45:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:52.422 [2024-07-15 13:45:40.031695] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:52.422 [2024-07-15 13:45:40.031739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.422 [2024-07-15 13:45:40.031761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a50060 00:22:52.422 [2024-07-15 13:45:40.031773] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.422 [2024-07-15 13:45:40.032861] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.422 [2024-07-15 13:45:40.032884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:52.422 spare 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:52.678 [2024-07-15 13:45:40.196139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:52.678 [2024-07-15 13:45:40.197117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:52.678 [2024-07-15 13:45:40.197245] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a50a80 00:22:52.678 [2024-07-15 13:45:40.197254] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:52.678 [2024-07-15 13:45:40.197313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1961c20 00:22:52.678 [2024-07-15 13:45:40.197400] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a50a80 00:22:52.678 [2024-07-15 13:45:40.197407] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a50a80 00:22:52.678 [2024-07-15 13:45:40.197458] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:52.678 13:45:40 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.678 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.934 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.934 "name": "raid_bdev1", 00:22:52.934 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:52.934 "strip_size_kb": 0, 00:22:52.934 "state": "online", 00:22:52.934 "raid_level": "raid1", 00:22:52.934 "superblock": true, 00:22:52.934 "num_base_bdevs": 2, 00:22:52.934 "num_base_bdevs_discovered": 2, 00:22:52.934 "num_base_bdevs_operational": 2, 00:22:52.935 "base_bdevs_list": [ 00:22:52.935 { 00:22:52.935 "name": "BaseBdev1", 00:22:52.935 "uuid": "9122410f-7a1e-5988-953c-d04de5b2c043", 00:22:52.935 "is_configured": true, 00:22:52.935 "data_offset": 256, 00:22:52.935 "data_size": 7936 00:22:52.935 }, 00:22:52.935 { 00:22:52.935 "name": "BaseBdev2", 00:22:52.935 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:52.935 "is_configured": true, 00:22:52.935 "data_offset": 256, 00:22:52.935 "data_size": 7936 00:22:52.935 } 00:22:52.935 ] 00:22:52.935 }' 00:22:52.935 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.935 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:53.498 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:53.498 13:45:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:53.498 [2024-07-15 13:45:41.062522] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:53.498 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:22:53.498 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.498 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:53.755 13:45:41 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:53.755 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:54.012 [2024-07-15 13:45:41.419299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1961c20 00:22:54.012 /dev/nbd0 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:54.012 1+0 records in 00:22:54.012 1+0 records out 00:22:54.012 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000173521 s, 23.6 MB/s 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # 
rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:54.012 13:45:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:22:54.574 7936+0 records in 00:22:54.574 7936+0 records out 00:22:54.574 32505856 bytes (33 MB, 31 MiB) copied, 0.520737 s, 62.4 MB/s 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:54.575 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:54.832 [2024-07-15 13:45:42.197630] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:54.832 [2024-07-15 13:45:42.363320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.832 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.122 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.122 "name": "raid_bdev1", 00:22:55.122 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:55.122 "strip_size_kb": 0, 00:22:55.122 "state": "online", 00:22:55.122 "raid_level": "raid1", 00:22:55.122 "superblock": true, 00:22:55.122 "num_base_bdevs": 2, 00:22:55.122 "num_base_bdevs_discovered": 1, 00:22:55.122 "num_base_bdevs_operational": 1, 00:22:55.122 "base_bdevs_list": [ 00:22:55.122 { 00:22:55.122 "name": null, 00:22:55.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.122 "is_configured": false, 00:22:55.122 "data_offset": 256, 00:22:55.122 "data_size": 7936 00:22:55.122 }, 00:22:55.122 { 00:22:55.122 "name": "BaseBdev2", 00:22:55.122 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:55.122 "is_configured": true, 00:22:55.122 "data_offset": 256, 00:22:55.122 "data_size": 7936 00:22:55.122 } 00:22:55.122 ] 00:22:55.122 }' 00:22:55.122 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.122 13:45:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:55.708 13:45:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:55.708 [2024-07-15 13:45:43.217516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:55.708 [2024-07-15 13:45:43.219526] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fac10 00:22:55.708 [2024-07-15 13:45:43.221122] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:55.708 13:45:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:56.635 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:56.635 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:56.635 
13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:56.635 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:56.635 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:56.635 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.635 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.891 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:56.891 "name": "raid_bdev1", 00:22:56.891 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:56.891 "strip_size_kb": 0, 00:22:56.891 "state": "online", 00:22:56.891 "raid_level": "raid1", 00:22:56.891 "superblock": true, 00:22:56.891 "num_base_bdevs": 2, 00:22:56.891 "num_base_bdevs_discovered": 2, 00:22:56.891 "num_base_bdevs_operational": 2, 00:22:56.891 "process": { 00:22:56.891 "type": "rebuild", 00:22:56.891 "target": "spare", 00:22:56.891 "progress": { 00:22:56.891 "blocks": 2816, 00:22:56.891 "percent": 35 00:22:56.891 } 00:22:56.891 }, 00:22:56.891 "base_bdevs_list": [ 00:22:56.891 { 00:22:56.891 "name": "spare", 00:22:56.891 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:22:56.891 "is_configured": true, 00:22:56.891 "data_offset": 256, 00:22:56.891 "data_size": 7936 00:22:56.891 }, 00:22:56.891 { 00:22:56.891 "name": "BaseBdev2", 00:22:56.891 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:56.891 "is_configured": true, 00:22:56.891 "data_offset": 256, 00:22:56.891 "data_size": 7936 00:22:56.891 } 00:22:56.891 ] 00:22:56.891 }' 00:22:56.891 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:56.891 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:56.891 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.149 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.149 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:57.149 [2024-07-15 13:45:44.685860] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:57.149 [2024-07-15 13:45:44.732236] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:57.149 [2024-07-15 13:45:44.732274] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:57.149 [2024-07-15 13:45:44.732285] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:57.149 [2024-07-15 13:45:44.732292] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:57.149 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:57.149 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:57.149 13:45:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.150 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.406 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.406 "name": "raid_bdev1", 00:22:57.406 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:57.406 "strip_size_kb": 0, 00:22:57.406 "state": "online", 00:22:57.406 "raid_level": "raid1", 00:22:57.406 "superblock": true, 00:22:57.406 "num_base_bdevs": 2, 00:22:57.406 "num_base_bdevs_discovered": 1, 00:22:57.406 "num_base_bdevs_operational": 1, 00:22:57.406 "base_bdevs_list": [ 00:22:57.406 { 00:22:57.406 "name": null, 00:22:57.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.406 "is_configured": false, 00:22:57.406 "data_offset": 256, 00:22:57.406 "data_size": 7936 00:22:57.406 }, 00:22:57.406 { 00:22:57.406 "name": "BaseBdev2", 00:22:57.406 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:57.406 "is_configured": true, 00:22:57.406 "data_offset": 256, 00:22:57.406 "data_size": 7936 00:22:57.406 } 00:22:57.406 ] 00:22:57.406 }' 00:22:57.406 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.406 13:45:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:57.981 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:57.982 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.982 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:57.982 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:57.982 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.982 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.982 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.265 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:22:58.265 "name": "raid_bdev1", 00:22:58.265 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:58.265 "strip_size_kb": 0, 00:22:58.265 "state": "online", 00:22:58.265 "raid_level": "raid1", 00:22:58.265 "superblock": true, 00:22:58.265 "num_base_bdevs": 2, 00:22:58.265 "num_base_bdevs_discovered": 1, 00:22:58.265 "num_base_bdevs_operational": 1, 00:22:58.265 "base_bdevs_list": [ 00:22:58.265 { 00:22:58.265 "name": null, 00:22:58.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.265 "is_configured": false, 00:22:58.265 "data_offset": 256, 00:22:58.265 "data_size": 7936 00:22:58.265 }, 00:22:58.265 { 00:22:58.265 "name": "BaseBdev2", 00:22:58.265 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:58.265 "is_configured": true, 00:22:58.265 "data_offset": 256, 00:22:58.265 "data_size": 7936 00:22:58.265 } 00:22:58.265 ] 00:22:58.265 }' 00:22:58.265 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.265 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:58.265 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.265 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:58.265 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:58.265 [2024-07-15 13:45:45.870452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:58.265 [2024-07-15 13:45:45.872782] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbb40 00:22:58.265 [2024-07-15 13:45:45.873961] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:58.523 13:45:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.454 13:45:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.454 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.454 "name": "raid_bdev1", 00:22:59.454 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:59.454 "strip_size_kb": 0, 00:22:59.454 "state": "online", 00:22:59.454 "raid_level": "raid1", 00:22:59.454 "superblock": true, 00:22:59.454 "num_base_bdevs": 2, 00:22:59.454 "num_base_bdevs_discovered": 2, 00:22:59.454 "num_base_bdevs_operational": 2, 00:22:59.454 "process": { 00:22:59.454 "type": 
"rebuild", 00:22:59.454 "target": "spare", 00:22:59.454 "progress": { 00:22:59.454 "blocks": 2816, 00:22:59.454 "percent": 35 00:22:59.454 } 00:22:59.454 }, 00:22:59.454 "base_bdevs_list": [ 00:22:59.454 { 00:22:59.454 "name": "spare", 00:22:59.454 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:22:59.454 "is_configured": true, 00:22:59.454 "data_offset": 256, 00:22:59.454 "data_size": 7936 00:22:59.454 }, 00:22:59.454 { 00:22:59.454 "name": "BaseBdev2", 00:22:59.454 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:59.454 "is_configured": true, 00:22:59.454 "data_offset": 256, 00:22:59.454 "data_size": 7936 00:22:59.454 } 00:22:59.454 ] 00:22:59.454 }' 00:22:59.454 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:59.710 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=847 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.710 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.967 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.967 "name": "raid_bdev1", 00:22:59.967 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:22:59.967 "strip_size_kb": 0, 00:22:59.967 "state": "online", 00:22:59.967 "raid_level": "raid1", 00:22:59.967 "superblock": true, 00:22:59.967 "num_base_bdevs": 2, 00:22:59.967 "num_base_bdevs_discovered": 2, 00:22:59.967 "num_base_bdevs_operational": 2, 
00:22:59.967 "process": { 00:22:59.967 "type": "rebuild", 00:22:59.967 "target": "spare", 00:22:59.967 "progress": { 00:22:59.967 "blocks": 3584, 00:22:59.967 "percent": 45 00:22:59.967 } 00:22:59.967 }, 00:22:59.967 "base_bdevs_list": [ 00:22:59.967 { 00:22:59.967 "name": "spare", 00:22:59.967 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:22:59.967 "is_configured": true, 00:22:59.967 "data_offset": 256, 00:22:59.967 "data_size": 7936 00:22:59.967 }, 00:22:59.967 { 00:22:59.967 "name": "BaseBdev2", 00:22:59.967 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:22:59.967 "is_configured": true, 00:22:59.967 "data_offset": 256, 00:22:59.967 "data_size": 7936 00:22:59.967 } 00:22:59.967 ] 00:22:59.967 }' 00:22:59.967 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.967 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.967 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.967 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.967 13:45:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.897 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.154 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.154 "name": "raid_bdev1", 00:23:01.154 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:01.154 "strip_size_kb": 0, 00:23:01.154 "state": "online", 00:23:01.154 "raid_level": "raid1", 00:23:01.154 "superblock": true, 00:23:01.154 "num_base_bdevs": 2, 00:23:01.154 "num_base_bdevs_discovered": 2, 00:23:01.154 "num_base_bdevs_operational": 2, 00:23:01.154 "process": { 00:23:01.154 "type": "rebuild", 00:23:01.154 "target": "spare", 00:23:01.154 "progress": { 00:23:01.154 "blocks": 6656, 00:23:01.154 "percent": 83 00:23:01.154 } 00:23:01.154 }, 00:23:01.154 "base_bdevs_list": [ 00:23:01.154 { 00:23:01.154 "name": "spare", 00:23:01.154 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:01.154 "is_configured": true, 00:23:01.154 "data_offset": 256, 00:23:01.154 "data_size": 7936 00:23:01.154 }, 00:23:01.154 { 00:23:01.154 "name": "BaseBdev2", 00:23:01.154 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:01.154 "is_configured": true, 00:23:01.154 "data_offset": 256, 00:23:01.154 "data_size": 7936 
00:23:01.154 } 00:23:01.154 ] 00:23:01.154 }' 00:23:01.154 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.154 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:01.154 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.154 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:01.154 13:45:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:01.410 [2024-07-15 13:45:48.996836] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:01.410 [2024-07-15 13:45:48.996880] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:01.410 [2024-07-15 13:45:48.996939] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:02.347 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:02.347 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:02.347 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.347 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:02.347 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.348 "name": "raid_bdev1", 00:23:02.348 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:02.348 "strip_size_kb": 0, 00:23:02.348 "state": "online", 00:23:02.348 "raid_level": "raid1", 00:23:02.348 "superblock": true, 00:23:02.348 "num_base_bdevs": 2, 00:23:02.348 "num_base_bdevs_discovered": 2, 00:23:02.348 "num_base_bdevs_operational": 2, 00:23:02.348 "base_bdevs_list": [ 00:23:02.348 { 00:23:02.348 "name": "spare", 00:23:02.348 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:02.348 "is_configured": true, 00:23:02.348 "data_offset": 256, 00:23:02.348 "data_size": 7936 00:23:02.348 }, 00:23:02.348 { 00:23:02.348 "name": "BaseBdev2", 00:23:02.348 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:02.348 "is_configured": true, 00:23:02.348 "data_offset": 256, 00:23:02.348 "data_size": 7936 00:23:02.348 } 00:23:02.348 ] 00:23:02.348 }' 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.348 13:45:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.605 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.605 "name": "raid_bdev1", 00:23:02.605 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:02.605 "strip_size_kb": 0, 00:23:02.605 "state": "online", 00:23:02.605 "raid_level": "raid1", 00:23:02.605 "superblock": true, 00:23:02.605 "num_base_bdevs": 2, 00:23:02.605 "num_base_bdevs_discovered": 2, 00:23:02.605 "num_base_bdevs_operational": 2, 00:23:02.605 "base_bdevs_list": [ 00:23:02.605 { 00:23:02.605 "name": "spare", 00:23:02.605 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:02.605 "is_configured": true, 00:23:02.605 "data_offset": 256, 00:23:02.605 "data_size": 7936 00:23:02.605 }, 00:23:02.605 { 00:23:02.605 "name": "BaseBdev2", 00:23:02.605 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:02.605 "is_configured": true, 00:23:02.605 "data_offset": 256, 00:23:02.605 "data_size": 7936 00:23:02.605 } 00:23:02.605 ] 00:23:02.605 }' 00:23:02.605 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.605 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:02.605 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.862 "name": "raid_bdev1", 00:23:02.862 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:02.862 "strip_size_kb": 0, 00:23:02.862 "state": "online", 00:23:02.862 "raid_level": "raid1", 00:23:02.862 "superblock": true, 00:23:02.862 "num_base_bdevs": 2, 00:23:02.862 "num_base_bdevs_discovered": 2, 00:23:02.862 "num_base_bdevs_operational": 2, 00:23:02.862 "base_bdevs_list": [ 00:23:02.862 { 00:23:02.862 "name": "spare", 00:23:02.862 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:02.862 "is_configured": true, 00:23:02.862 "data_offset": 256, 00:23:02.862 "data_size": 7936 00:23:02.862 }, 00:23:02.862 { 00:23:02.862 "name": "BaseBdev2", 00:23:02.862 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:02.862 "is_configured": true, 00:23:02.862 "data_offset": 256, 00:23:02.862 "data_size": 7936 00:23:02.862 } 00:23:02.862 ] 00:23:02.862 }' 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.862 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:03.426 13:45:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:03.426 [2024-07-15 13:45:51.041215] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:03.426 [2024-07-15 13:45:51.041238] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:03.426 [2024-07-15 13:45:51.041280] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:03.426 [2024-07-15 13:45:51.041338] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:03.426 [2024-07-15 13:45:51.041347] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a50a80 name raid_bdev1, state offline 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:03.685 13:45:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:03.685 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:03.942 /dev/nbd0 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:03.942 1+0 records in 00:23:03.942 1+0 records out 00:23:03.942 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294579 s, 13.9 MB/s 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:03.942 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:04.200 /dev/nbd1 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:04.200 1+0 records in 00:23:04.200 1+0 records out 00:23:04.200 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271531 s, 15.1 MB/s 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:04.200 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:04.456 13:45:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:04.712 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:04.969 [2024-07-15 13:45:52.437440] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:04.969 [2024-07-15 13:45:52.437481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.969 [2024-07-15 13:45:52.437497] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a959d0 00:23:04.969 [2024-07-15 13:45:52.437505] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.969 [2024-07-15 13:45:52.438592] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.969 
[2024-07-15 13:45:52.438614] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:04.969 [2024-07-15 13:45:52.438659] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:04.969 [2024-07-15 13:45:52.438680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:04.969 [2024-07-15 13:45:52.438750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:04.969 spare 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.969 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.969 [2024-07-15 13:45:52.539040] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18fa6b0 00:23:04.969 [2024-07-15 13:45:52.539056] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:04.969 [2024-07-15 13:45:52.539110] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a51680 00:23:04.969 [2024-07-15 13:45:52.539206] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18fa6b0 00:23:04.969 [2024-07-15 13:45:52.539213] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18fa6b0 00:23:04.969 [2024-07-15 13:45:52.539271] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:05.225 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.225 "name": "raid_bdev1", 00:23:05.225 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:05.225 "strip_size_kb": 0, 00:23:05.225 "state": "online", 00:23:05.225 "raid_level": "raid1", 00:23:05.225 "superblock": true, 00:23:05.225 "num_base_bdevs": 2, 00:23:05.225 "num_base_bdevs_discovered": 2, 00:23:05.225 "num_base_bdevs_operational": 2, 00:23:05.225 "base_bdevs_list": [ 00:23:05.225 { 00:23:05.225 "name": "spare", 00:23:05.225 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:05.225 "is_configured": true, 00:23:05.225 "data_offset": 256, 00:23:05.225 "data_size": 7936 00:23:05.225 
}, 00:23:05.225 { 00:23:05.225 "name": "BaseBdev2", 00:23:05.225 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:05.225 "is_configured": true, 00:23:05.225 "data_offset": 256, 00:23:05.225 "data_size": 7936 00:23:05.225 } 00:23:05.225 ] 00:23:05.225 }' 00:23:05.225 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.225 13:45:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.789 "name": "raid_bdev1", 00:23:05.789 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:05.789 "strip_size_kb": 0, 00:23:05.789 "state": "online", 00:23:05.789 "raid_level": "raid1", 00:23:05.789 "superblock": true, 00:23:05.789 "num_base_bdevs": 2, 00:23:05.789 "num_base_bdevs_discovered": 2, 00:23:05.789 "num_base_bdevs_operational": 2, 00:23:05.789 "base_bdevs_list": [ 00:23:05.789 { 00:23:05.789 "name": "spare", 00:23:05.789 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:05.789 "is_configured": true, 00:23:05.789 "data_offset": 256, 00:23:05.789 "data_size": 7936 00:23:05.789 }, 00:23:05.789 { 00:23:05.789 "name": "BaseBdev2", 00:23:05.789 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:05.789 "is_configured": true, 00:23:05.789 "data_offset": 256, 00:23:05.789 "data_size": 7936 00:23:05.789 } 00:23:05.789 ] 00:23:05.789 }' 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.789 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:06.045 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:06.045 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:06.301 [2024-07-15 13:45:53.724815] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.301 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.557 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.557 "name": "raid_bdev1", 00:23:06.557 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:06.557 "strip_size_kb": 0, 00:23:06.557 "state": "online", 00:23:06.557 "raid_level": "raid1", 00:23:06.557 "superblock": true, 00:23:06.557 "num_base_bdevs": 2, 00:23:06.557 "num_base_bdevs_discovered": 1, 00:23:06.557 "num_base_bdevs_operational": 1, 00:23:06.557 "base_bdevs_list": [ 00:23:06.557 { 00:23:06.557 "name": null, 00:23:06.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.557 "is_configured": false, 00:23:06.557 "data_offset": 256, 00:23:06.557 "data_size": 7936 00:23:06.557 }, 00:23:06.557 { 00:23:06.557 "name": "BaseBdev2", 00:23:06.557 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:06.557 "is_configured": true, 00:23:06.557 "data_offset": 256, 00:23:06.557 "data_size": 7936 00:23:06.557 } 00:23:06.557 ] 00:23:06.557 }' 00:23:06.557 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.557 13:45:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:06.814 13:45:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:07.071 [2024-07-15 13:45:54.558983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:07.071 [2024-07-15 13:45:54.559111] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:07.071 [2024-07-15 13:45:54.559128] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: 
Re-adding bdev spare to raid bdev raid_bdev1. 00:23:07.071 [2024-07-15 13:45:54.559148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:07.071 [2024-07-15 13:45:54.561109] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a50450 00:23:07.071 [2024-07-15 13:45:54.562089] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:07.071 13:45:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.002 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.259 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:08.259 "name": "raid_bdev1", 00:23:08.259 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:08.259 "strip_size_kb": 0, 00:23:08.259 "state": "online", 00:23:08.259 "raid_level": "raid1", 00:23:08.259 "superblock": true, 00:23:08.259 "num_base_bdevs": 2, 00:23:08.259 "num_base_bdevs_discovered": 2, 00:23:08.259 "num_base_bdevs_operational": 2, 00:23:08.259 "process": { 00:23:08.259 "type": "rebuild", 00:23:08.259 "target": "spare", 00:23:08.259 "progress": { 00:23:08.259 "blocks": 2816, 00:23:08.259 "percent": 35 00:23:08.259 } 00:23:08.259 }, 00:23:08.259 "base_bdevs_list": [ 00:23:08.259 { 00:23:08.259 "name": "spare", 00:23:08.259 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:08.259 "is_configured": true, 00:23:08.259 "data_offset": 256, 00:23:08.259 "data_size": 7936 00:23:08.259 }, 00:23:08.259 { 00:23:08.259 "name": "BaseBdev2", 00:23:08.259 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:08.259 "is_configured": true, 00:23:08.259 "data_offset": 256, 00:23:08.259 "data_size": 7936 00:23:08.259 } 00:23:08.259 ] 00:23:08.259 }' 00:23:08.259 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:08.259 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:08.259 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:08.259 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:08.259 13:45:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:08.516 [2024-07-15 13:45:56.015506] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:08.516 [2024-07-15 13:45:56.073331] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:08.516 [2024-07-15 13:45:56.073365] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.516 [2024-07-15 13:45:56.073375] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:08.516 [2024-07-15 13:45:56.073380] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.516 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.772 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.772 "name": "raid_bdev1", 00:23:08.772 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:08.772 "strip_size_kb": 0, 00:23:08.772 "state": "online", 00:23:08.772 "raid_level": "raid1", 00:23:08.772 "superblock": true, 00:23:08.772 "num_base_bdevs": 2, 00:23:08.772 "num_base_bdevs_discovered": 1, 00:23:08.772 "num_base_bdevs_operational": 1, 00:23:08.772 "base_bdevs_list": [ 00:23:08.772 { 00:23:08.772 "name": null, 00:23:08.772 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.772 "is_configured": false, 00:23:08.772 "data_offset": 256, 00:23:08.772 "data_size": 7936 00:23:08.772 }, 00:23:08.772 { 00:23:08.772 "name": "BaseBdev2", 00:23:08.772 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:08.772 "is_configured": true, 00:23:08.772 "data_offset": 256, 00:23:08.772 "data_size": 7936 00:23:08.773 } 00:23:08.773 ] 00:23:08.773 }' 00:23:08.773 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.773 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:09.337 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:09.337 [2024-07-15 13:45:56.914625] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:09.337 [2024-07-15 13:45:56.914669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.337 [2024-07-15 13:45:56.914686] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fa930 00:23:09.337 [2024-07-15 13:45:56.914696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.337 [2024-07-15 13:45:56.914875] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.337 [2024-07-15 13:45:56.914887] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:09.337 [2024-07-15 13:45:56.914933] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:09.337 [2024-07-15 13:45:56.914941] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:09.337 [2024-07-15 13:45:56.914949] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:09.337 [2024-07-15 13:45:56.914962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:09.337 [2024-07-15 13:45:56.916926] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbb40 00:23:09.337 [2024-07-15 13:45:56.917924] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:09.337 spare 00:23:09.337 13:45:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.708 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.708 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:10.708 "name": "raid_bdev1", 00:23:10.708 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:10.708 "strip_size_kb": 0, 00:23:10.708 "state": "online", 00:23:10.708 "raid_level": "raid1", 00:23:10.708 "superblock": true, 00:23:10.708 "num_base_bdevs": 2, 00:23:10.708 "num_base_bdevs_discovered": 2, 00:23:10.708 "num_base_bdevs_operational": 2, 00:23:10.708 "process": { 00:23:10.708 "type": "rebuild", 00:23:10.708 "target": "spare", 00:23:10.708 "progress": { 00:23:10.708 "blocks": 2816, 00:23:10.708 "percent": 35 00:23:10.708 } 00:23:10.708 }, 00:23:10.708 "base_bdevs_list": [ 00:23:10.708 { 00:23:10.708 "name": "spare", 00:23:10.708 "uuid": "c4e40b99-e632-529f-ba18-0ddf53a58b70", 00:23:10.708 "is_configured": true, 00:23:10.708 "data_offset": 256, 00:23:10.708 "data_size": 7936 00:23:10.708 }, 00:23:10.708 { 00:23:10.708 "name": 
"BaseBdev2", 00:23:10.708 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:10.708 "is_configured": true, 00:23:10.708 "data_offset": 256, 00:23:10.708 "data_size": 7936 00:23:10.708 } 00:23:10.708 ] 00:23:10.708 }' 00:23:10.708 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:10.708 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:10.708 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:10.708 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:10.708 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:10.965 [2024-07-15 13:45:58.351334] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:10.965 [2024-07-15 13:45:58.428930] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:10.965 [2024-07-15 13:45:58.428965] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:10.965 [2024-07-15 13:45:58.428992] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:10.965 [2024-07-15 13:45:58.429004] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.965 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.220 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.220 "name": "raid_bdev1", 00:23:11.220 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:11.220 "strip_size_kb": 0, 00:23:11.221 "state": "online", 00:23:11.221 "raid_level": "raid1", 00:23:11.221 "superblock": true, 00:23:11.221 "num_base_bdevs": 2, 00:23:11.221 "num_base_bdevs_discovered": 
1, 00:23:11.221 "num_base_bdevs_operational": 1, 00:23:11.221 "base_bdevs_list": [ 00:23:11.221 { 00:23:11.221 "name": null, 00:23:11.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.221 "is_configured": false, 00:23:11.221 "data_offset": 256, 00:23:11.221 "data_size": 7936 00:23:11.221 }, 00:23:11.221 { 00:23:11.221 "name": "BaseBdev2", 00:23:11.221 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:11.221 "is_configured": true, 00:23:11.221 "data_offset": 256, 00:23:11.221 "data_size": 7936 00:23:11.221 } 00:23:11.221 ] 00:23:11.221 }' 00:23:11.221 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.221 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.786 "name": "raid_bdev1", 00:23:11.786 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:11.786 "strip_size_kb": 0, 00:23:11.786 "state": "online", 00:23:11.786 "raid_level": "raid1", 00:23:11.786 "superblock": true, 00:23:11.786 "num_base_bdevs": 2, 00:23:11.786 "num_base_bdevs_discovered": 1, 00:23:11.786 "num_base_bdevs_operational": 1, 00:23:11.786 "base_bdevs_list": [ 00:23:11.786 { 00:23:11.786 "name": null, 00:23:11.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.786 "is_configured": false, 00:23:11.786 "data_offset": 256, 00:23:11.786 "data_size": 7936 00:23:11.786 }, 00:23:11.786 { 00:23:11.786 "name": "BaseBdev2", 00:23:11.786 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:11.786 "is_configured": true, 00:23:11.786 "data_offset": 256, 00:23:11.786 "data_size": 7936 00:23:11.786 } 00:23:11.786 ] 00:23:11.786 }' 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:11.786 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:12.044 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:12.302 [2024-07-15 13:45:59.735697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:12.302 [2024-07-15 13:45:59.735738] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.302 [2024-07-15 13:45:59.735770] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fc1c0 00:23:12.302 [2024-07-15 13:45:59.735779] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.302 [2024-07-15 13:45:59.735936] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.302 [2024-07-15 13:45:59.735947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:12.302 [2024-07-15 13:45:59.735985] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:12.302 [2024-07-15 13:45:59.735993] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:12.302 [2024-07-15 13:45:59.736010] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:12.302 BaseBdev1 00:23:12.302 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.236 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.494 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.494 "name": "raid_bdev1", 00:23:13.494 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:13.494 "strip_size_kb": 0, 00:23:13.494 "state": "online", 00:23:13.494 "raid_level": "raid1", 00:23:13.494 "superblock": true, 00:23:13.494 "num_base_bdevs": 2, 00:23:13.494 "num_base_bdevs_discovered": 1, 00:23:13.494 "num_base_bdevs_operational": 1, 00:23:13.494 "base_bdevs_list": [ 00:23:13.494 { 
00:23:13.494 "name": null, 00:23:13.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.494 "is_configured": false, 00:23:13.494 "data_offset": 256, 00:23:13.494 "data_size": 7936 00:23:13.494 }, 00:23:13.494 { 00:23:13.494 "name": "BaseBdev2", 00:23:13.494 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:13.494 "is_configured": true, 00:23:13.494 "data_offset": 256, 00:23:13.494 "data_size": 7936 00:23:13.494 } 00:23:13.494 ] 00:23:13.494 }' 00:23:13.494 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.494 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.061 "name": "raid_bdev1", 00:23:14.061 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:14.061 "strip_size_kb": 0, 00:23:14.061 "state": "online", 00:23:14.061 "raid_level": "raid1", 00:23:14.061 "superblock": true, 00:23:14.061 "num_base_bdevs": 2, 00:23:14.061 "num_base_bdevs_discovered": 1, 00:23:14.061 "num_base_bdevs_operational": 1, 00:23:14.061 "base_bdevs_list": [ 00:23:14.061 { 00:23:14.061 "name": null, 00:23:14.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.061 "is_configured": false, 00:23:14.061 "data_offset": 256, 00:23:14.061 "data_size": 7936 00:23:14.061 }, 00:23:14.061 { 00:23:14.061 "name": "BaseBdev2", 00:23:14.061 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:14.061 "is_configured": true, 00:23:14.061 "data_offset": 256, 00:23:14.061 "data_size": 7936 00:23:14.061 } 00:23:14.061 ] 00:23:14.061 }' 00:23:14.061 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:14.319 [2024-07-15 13:46:01.889276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:14.319 [2024-07-15 13:46:01.889385] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:14.319 [2024-07-15 13:46:01.889396] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:14.319 request: 00:23:14.319 { 00:23:14.319 "base_bdev": "BaseBdev1", 00:23:14.319 "raid_bdev": "raid_bdev1", 00:23:14.319 "method": "bdev_raid_add_base_bdev", 00:23:14.319 "req_id": 1 00:23:14.319 } 00:23:14.319 Got JSON-RPC error response 00:23:14.319 response: 00:23:14.319 { 00:23:14.319 "code": -22, 00:23:14.319 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:14.319 } 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:14.319 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:15.691 13:46:02 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.691 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.691 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.691 "name": "raid_bdev1", 00:23:15.691 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:15.691 "strip_size_kb": 0, 00:23:15.691 "state": "online", 00:23:15.691 "raid_level": "raid1", 00:23:15.691 "superblock": true, 00:23:15.691 "num_base_bdevs": 2, 00:23:15.691 "num_base_bdevs_discovered": 1, 00:23:15.691 "num_base_bdevs_operational": 1, 00:23:15.691 "base_bdevs_list": [ 00:23:15.691 { 00:23:15.691 "name": null, 00:23:15.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.691 "is_configured": false, 00:23:15.691 "data_offset": 256, 00:23:15.691 "data_size": 7936 00:23:15.691 }, 00:23:15.691 { 00:23:15.691 "name": "BaseBdev2", 00:23:15.691 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:15.691 "is_configured": true, 00:23:15.691 "data_offset": 256, 00:23:15.691 "data_size": 7936 00:23:15.691 } 00:23:15.691 ] 00:23:15.691 }' 00:23:15.691 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.691 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.254 "name": "raid_bdev1", 00:23:16.254 "uuid": "41fa255c-9a71-431e-80ed-bcad0f14be78", 00:23:16.254 "strip_size_kb": 0, 
00:23:16.254 "state": "online", 00:23:16.254 "raid_level": "raid1", 00:23:16.254 "superblock": true, 00:23:16.254 "num_base_bdevs": 2, 00:23:16.254 "num_base_bdevs_discovered": 1, 00:23:16.254 "num_base_bdevs_operational": 1, 00:23:16.254 "base_bdevs_list": [ 00:23:16.254 { 00:23:16.254 "name": null, 00:23:16.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.254 "is_configured": false, 00:23:16.254 "data_offset": 256, 00:23:16.254 "data_size": 7936 00:23:16.254 }, 00:23:16.254 { 00:23:16.254 "name": "BaseBdev2", 00:23:16.254 "uuid": "9084ee5e-556a-576c-b9de-a0cd5537ca9b", 00:23:16.254 "is_configured": true, 00:23:16.254 "data_offset": 256, 00:23:16.254 "data_size": 7936 00:23:16.254 } 00:23:16.254 ] 00:23:16.254 }' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 95996 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 95996 ']' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 95996 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 95996 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 95996' 00:23:16.254 killing process with pid 95996 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 95996 00:23:16.254 Received shutdown signal, test time was about 60.000000 seconds 00:23:16.254 00:23:16.254 Latency(us) 00:23:16.254 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:16.254 =================================================================================================================== 00:23:16.254 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:16.254 [2024-07-15 13:46:03.863725] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:16.254 [2024-07-15 13:46:03.863795] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:16.254 [2024-07-15 13:46:03.863828] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:16.254 [2024-07-15 13:46:03.863837] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18fa6b0 name raid_bdev1, state offline 00:23:16.254 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 95996 00:23:16.511 
[2024-07-15 13:46:03.901428] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:16.511 13:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:23:16.511 00:23:16.511 real 0m26.152s 00:23:16.511 user 0m39.442s 00:23:16.511 sys 0m4.273s 00:23:16.511 13:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:16.511 13:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:16.511 ************************************ 00:23:16.511 END TEST raid_rebuild_test_sb_md_separate 00:23:16.511 ************************************ 00:23:16.769 13:46:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:16.769 13:46:04 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:23:16.769 13:46:04 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:23:16.769 13:46:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:16.769 13:46:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:16.769 13:46:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:16.769 ************************************ 00:23:16.769 START TEST raid_state_function_test_sb_md_interleaved 00:23:16.769 ************************************ 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # 
local strip_size 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=99839 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 99839' 00:23:16.769 Process raid pid: 99839 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 99839 /var/tmp/spdk-raid.sock 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 99839 ']' 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:16.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:16.769 13:46:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:16.769 [2024-07-15 13:46:04.253774] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:23:16.769 [2024-07-15 13:46:04.253825] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:16.769 [2024-07-15 13:46:04.342210] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:17.028 [2024-07-15 13:46:04.433683] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:17.028 [2024-07-15 13:46:04.498597] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:17.028 [2024-07-15 13:46:04.498630] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:17.592 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:17.592 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:23:17.592 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:17.592 [2024-07-15 13:46:05.206231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:17.593 [2024-07-15 13:46:05.206266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:17.593 [2024-07-15 13:46:05.206273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:17.593 [2024-07-15 13:46:05.206297] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:17.850 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.850 "name": "Existed_Raid", 00:23:17.850 "uuid": "ee277a3b-b4ae-4dc9-a177-70929f5b397d", 00:23:17.850 "strip_size_kb": 0, 00:23:17.850 "state": "configuring", 00:23:17.850 "raid_level": "raid1", 00:23:17.850 "superblock": true, 00:23:17.850 "num_base_bdevs": 2, 00:23:17.850 "num_base_bdevs_discovered": 0, 00:23:17.850 "num_base_bdevs_operational": 2, 00:23:17.850 "base_bdevs_list": [ 00:23:17.850 { 00:23:17.850 "name": "BaseBdev1", 00:23:17.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.850 "is_configured": false, 00:23:17.850 "data_offset": 0, 00:23:17.850 "data_size": 0 00:23:17.850 }, 00:23:17.850 { 00:23:17.850 "name": "BaseBdev2", 00:23:17.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.851 "is_configured": false, 00:23:17.851 "data_offset": 0, 00:23:17.851 "data_size": 0 00:23:17.851 } 00:23:17.851 ] 00:23:17.851 }' 00:23:17.851 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.851 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:18.415 13:46:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:18.415 [2024-07-15 13:46:06.028261] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:18.415 [2024-07-15 13:46:06.028287] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x945f30 name Existed_Raid, state configuring 00:23:18.672 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:18.672 [2024-07-15 13:46:06.204728] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:18.672 [2024-07-15 13:46:06.204754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:18.672 [2024-07-15 13:46:06.204760] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:18.672 [2024-07-15 13:46:06.204767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:18.672 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:23:18.929 [2024-07-15 13:46:06.398058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:18.929 BaseBdev1 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:18.929 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:19.187 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:19.187 [ 00:23:19.187 { 00:23:19.187 "name": "BaseBdev1", 00:23:19.187 "aliases": [ 00:23:19.187 "e318809a-e937-4bd6-b0ce-6df115216360" 00:23:19.187 ], 00:23:19.187 "product_name": "Malloc disk", 00:23:19.187 "block_size": 4128, 00:23:19.187 "num_blocks": 8192, 00:23:19.187 "uuid": "e318809a-e937-4bd6-b0ce-6df115216360", 00:23:19.187 "md_size": 32, 00:23:19.187 "md_interleave": true, 00:23:19.187 "dif_type": 0, 00:23:19.187 "assigned_rate_limits": { 00:23:19.187 "rw_ios_per_sec": 0, 00:23:19.187 "rw_mbytes_per_sec": 0, 00:23:19.187 "r_mbytes_per_sec": 0, 00:23:19.187 "w_mbytes_per_sec": 0 00:23:19.187 }, 00:23:19.187 "claimed": true, 00:23:19.187 "claim_type": "exclusive_write", 00:23:19.187 "zoned": false, 00:23:19.187 "supported_io_types": { 00:23:19.187 "read": true, 00:23:19.187 "write": true, 00:23:19.187 "unmap": true, 00:23:19.187 "flush": true, 00:23:19.187 "reset": true, 00:23:19.187 "nvme_admin": false, 00:23:19.187 "nvme_io": false, 00:23:19.187 "nvme_io_md": false, 00:23:19.187 "write_zeroes": true, 00:23:19.187 "zcopy": true, 00:23:19.187 "get_zone_info": false, 00:23:19.187 "zone_management": false, 00:23:19.187 "zone_append": false, 00:23:19.187 "compare": false, 00:23:19.187 "compare_and_write": false, 00:23:19.187 "abort": true, 00:23:19.187 "seek_hole": false, 00:23:19.187 "seek_data": false, 00:23:19.187 "copy": true, 00:23:19.187 "nvme_iov_md": false 00:23:19.187 }, 00:23:19.187 "memory_domains": [ 00:23:19.187 { 00:23:19.187 "dma_device_id": "system", 00:23:19.187 "dma_device_type": 1 00:23:19.187 }, 00:23:19.187 { 00:23:19.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.187 "dma_device_type": 2 00:23:19.187 } 00:23:19.187 ], 00:23:19.187 "driver_specific": {} 00:23:19.187 } 00:23:19.187 ] 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.188 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:19.446 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.446 "name": "Existed_Raid", 00:23:19.446 "uuid": "38e4e38c-3961-413b-b087-ff48ecd579d9", 00:23:19.446 "strip_size_kb": 0, 00:23:19.446 "state": "configuring", 00:23:19.446 "raid_level": "raid1", 00:23:19.446 "superblock": true, 00:23:19.446 "num_base_bdevs": 2, 00:23:19.446 "num_base_bdevs_discovered": 1, 00:23:19.446 "num_base_bdevs_operational": 2, 00:23:19.446 "base_bdevs_list": [ 00:23:19.446 { 00:23:19.446 "name": "BaseBdev1", 00:23:19.446 "uuid": "e318809a-e937-4bd6-b0ce-6df115216360", 00:23:19.446 "is_configured": true, 00:23:19.446 "data_offset": 256, 00:23:19.446 "data_size": 7936 00:23:19.446 }, 00:23:19.446 { 00:23:19.446 "name": "BaseBdev2", 00:23:19.446 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.446 "is_configured": false, 00:23:19.446 "data_offset": 0, 00:23:19.446 "data_size": 0 00:23:19.446 } 00:23:19.446 ] 00:23:19.446 }' 00:23:19.446 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.446 13:46:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:20.011 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:20.011 [2024-07-15 13:46:07.569108] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:20.011 [2024-07-15 13:46:07.569139] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x945820 name Existed_Raid, state configuring 00:23:20.011 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:20.268 [2024-07-15 13:46:07.741583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:20.268 [2024-07-15 13:46:07.742678] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:20.268 [2024-07-15 13:46:07.742703] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:20.268 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:20.268 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:20.268 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:20.268 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.269 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:20.526 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:20.526 "name": "Existed_Raid", 00:23:20.526 "uuid": "3a8c4c6b-a10d-4707-ac04-ff01e2faa19c", 00:23:20.526 "strip_size_kb": 0, 00:23:20.526 "state": "configuring", 00:23:20.526 "raid_level": "raid1", 00:23:20.526 "superblock": true, 00:23:20.526 "num_base_bdevs": 2, 00:23:20.526 "num_base_bdevs_discovered": 1, 00:23:20.526 "num_base_bdevs_operational": 2, 00:23:20.526 "base_bdevs_list": [ 00:23:20.526 { 00:23:20.526 "name": "BaseBdev1", 00:23:20.526 "uuid": "e318809a-e937-4bd6-b0ce-6df115216360", 00:23:20.526 "is_configured": true, 00:23:20.526 "data_offset": 256, 00:23:20.526 "data_size": 7936 00:23:20.526 }, 00:23:20.526 { 00:23:20.526 "name": "BaseBdev2", 00:23:20.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.526 "is_configured": false, 00:23:20.526 "data_offset": 0, 00:23:20.526 "data_size": 0 00:23:20.526 } 00:23:20.526 ] 00:23:20.526 }' 00:23:20.526 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:20.526 13:46:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:23:21.092 [2024-07-15 13:46:08.582829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:21.092 [2024-07-15 13:46:08.582939] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9476c0 00:23:21.092 [2024-07-15 13:46:08.582949] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:21.092 [2024-07-15 13:46:08.583004] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x944f20 00:23:21.092 [2024-07-15 13:46:08.583062] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9476c0 00:23:21.092 [2024-07-15 13:46:08.583069] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, 
raid_bdev 0x9476c0 00:23:21.092 [2024-07-15 13:46:08.583108] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.092 BaseBdev2 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:21.092 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:21.352 [ 00:23:21.352 { 00:23:21.352 "name": "BaseBdev2", 00:23:21.352 "aliases": [ 00:23:21.352 "170ed8d6-17bf-4891-ab0d-397e4fe91220" 00:23:21.352 ], 00:23:21.352 "product_name": "Malloc disk", 00:23:21.352 "block_size": 4128, 00:23:21.352 "num_blocks": 8192, 00:23:21.352 "uuid": "170ed8d6-17bf-4891-ab0d-397e4fe91220", 00:23:21.352 "md_size": 32, 00:23:21.352 "md_interleave": true, 00:23:21.352 "dif_type": 0, 00:23:21.352 "assigned_rate_limits": { 00:23:21.352 "rw_ios_per_sec": 0, 00:23:21.352 "rw_mbytes_per_sec": 0, 00:23:21.352 "r_mbytes_per_sec": 0, 00:23:21.352 "w_mbytes_per_sec": 0 00:23:21.352 }, 00:23:21.352 "claimed": true, 00:23:21.352 "claim_type": "exclusive_write", 00:23:21.352 "zoned": false, 00:23:21.352 "supported_io_types": { 00:23:21.352 "read": true, 00:23:21.352 "write": true, 00:23:21.352 "unmap": true, 00:23:21.352 "flush": true, 00:23:21.352 "reset": true, 00:23:21.352 "nvme_admin": false, 00:23:21.352 "nvme_io": false, 00:23:21.352 "nvme_io_md": false, 00:23:21.352 "write_zeroes": true, 00:23:21.352 "zcopy": true, 00:23:21.352 "get_zone_info": false, 00:23:21.352 "zone_management": false, 00:23:21.352 "zone_append": false, 00:23:21.352 "compare": false, 00:23:21.352 "compare_and_write": false, 00:23:21.352 "abort": true, 00:23:21.352 "seek_hole": false, 00:23:21.352 "seek_data": false, 00:23:21.352 "copy": true, 00:23:21.352 "nvme_iov_md": false 00:23:21.352 }, 00:23:21.352 "memory_domains": [ 00:23:21.352 { 00:23:21.352 "dma_device_id": "system", 00:23:21.352 "dma_device_type": 1 00:23:21.352 }, 00:23:21.352 { 00:23:21.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:21.352 "dma_device_type": 2 00:23:21.352 } 00:23:21.352 ], 00:23:21.352 "driver_specific": {} 00:23:21.352 } 00:23:21.352 ] 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:21.352 13:46:08 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.352 13:46:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:21.652 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.652 "name": "Existed_Raid", 00:23:21.652 "uuid": "3a8c4c6b-a10d-4707-ac04-ff01e2faa19c", 00:23:21.652 "strip_size_kb": 0, 00:23:21.652 "state": "online", 00:23:21.652 "raid_level": "raid1", 00:23:21.652 "superblock": true, 00:23:21.652 "num_base_bdevs": 2, 00:23:21.652 "num_base_bdevs_discovered": 2, 00:23:21.652 "num_base_bdevs_operational": 2, 00:23:21.652 "base_bdevs_list": [ 00:23:21.652 { 00:23:21.652 "name": "BaseBdev1", 00:23:21.652 "uuid": "e318809a-e937-4bd6-b0ce-6df115216360", 00:23:21.652 "is_configured": true, 00:23:21.652 "data_offset": 256, 00:23:21.652 "data_size": 7936 00:23:21.652 }, 00:23:21.652 { 00:23:21.652 "name": "BaseBdev2", 00:23:21.652 "uuid": "170ed8d6-17bf-4891-ab0d-397e4fe91220", 00:23:21.652 "is_configured": true, 00:23:21.652 "data_offset": 256, 00:23:21.652 "data_size": 7936 00:23:21.652 } 00:23:21.652 ] 00:23:21.652 }' 00:23:21.652 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.652 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:22.243 [2024-07-15 13:46:09.778146] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:22.243 "name": "Existed_Raid", 00:23:22.243 "aliases": [ 00:23:22.243 "3a8c4c6b-a10d-4707-ac04-ff01e2faa19c" 00:23:22.243 ], 00:23:22.243 "product_name": "Raid Volume", 00:23:22.243 "block_size": 4128, 00:23:22.243 "num_blocks": 7936, 00:23:22.243 "uuid": "3a8c4c6b-a10d-4707-ac04-ff01e2faa19c", 00:23:22.243 "md_size": 32, 00:23:22.243 "md_interleave": true, 00:23:22.243 "dif_type": 0, 00:23:22.243 "assigned_rate_limits": { 00:23:22.243 "rw_ios_per_sec": 0, 00:23:22.243 "rw_mbytes_per_sec": 0, 00:23:22.243 "r_mbytes_per_sec": 0, 00:23:22.243 "w_mbytes_per_sec": 0 00:23:22.243 }, 00:23:22.243 "claimed": false, 00:23:22.243 "zoned": false, 00:23:22.243 "supported_io_types": { 00:23:22.243 "read": true, 00:23:22.243 "write": true, 00:23:22.243 "unmap": false, 00:23:22.243 "flush": false, 00:23:22.243 "reset": true, 00:23:22.243 "nvme_admin": false, 00:23:22.243 "nvme_io": false, 00:23:22.243 "nvme_io_md": false, 00:23:22.243 "write_zeroes": true, 00:23:22.243 "zcopy": false, 00:23:22.243 "get_zone_info": false, 00:23:22.243 "zone_management": false, 00:23:22.243 "zone_append": false, 00:23:22.243 "compare": false, 00:23:22.243 "compare_and_write": false, 00:23:22.243 "abort": false, 00:23:22.243 "seek_hole": false, 00:23:22.243 "seek_data": false, 00:23:22.243 "copy": false, 00:23:22.243 "nvme_iov_md": false 00:23:22.243 }, 00:23:22.243 "memory_domains": [ 00:23:22.243 { 00:23:22.243 "dma_device_id": "system", 00:23:22.243 "dma_device_type": 1 00:23:22.243 }, 00:23:22.243 { 00:23:22.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.243 "dma_device_type": 2 00:23:22.243 }, 00:23:22.243 { 00:23:22.243 "dma_device_id": "system", 00:23:22.243 "dma_device_type": 1 00:23:22.243 }, 00:23:22.243 { 00:23:22.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.243 "dma_device_type": 2 00:23:22.243 } 00:23:22.243 ], 00:23:22.243 "driver_specific": { 00:23:22.243 "raid": { 00:23:22.243 "uuid": "3a8c4c6b-a10d-4707-ac04-ff01e2faa19c", 00:23:22.243 "strip_size_kb": 0, 00:23:22.243 "state": "online", 00:23:22.243 "raid_level": "raid1", 00:23:22.243 "superblock": true, 00:23:22.243 "num_base_bdevs": 2, 00:23:22.243 "num_base_bdevs_discovered": 2, 00:23:22.243 "num_base_bdevs_operational": 2, 00:23:22.243 "base_bdevs_list": [ 00:23:22.243 { 00:23:22.243 "name": "BaseBdev1", 00:23:22.243 "uuid": "e318809a-e937-4bd6-b0ce-6df115216360", 00:23:22.243 "is_configured": true, 00:23:22.243 "data_offset": 256, 00:23:22.243 "data_size": 7936 00:23:22.243 }, 00:23:22.243 { 00:23:22.243 "name": "BaseBdev2", 00:23:22.243 "uuid": "170ed8d6-17bf-4891-ab0d-397e4fe91220", 00:23:22.243 "is_configured": true, 00:23:22.243 "data_offset": 256, 00:23:22.243 "data_size": 7936 00:23:22.243 } 00:23:22.243 ] 00:23:22.243 } 00:23:22.243 } 00:23:22.243 }' 00:23:22.243 13:46:09 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:22.243 BaseBdev2' 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:22.243 13:46:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:22.501 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:22.501 "name": "BaseBdev1", 00:23:22.501 "aliases": [ 00:23:22.501 "e318809a-e937-4bd6-b0ce-6df115216360" 00:23:22.501 ], 00:23:22.501 "product_name": "Malloc disk", 00:23:22.501 "block_size": 4128, 00:23:22.501 "num_blocks": 8192, 00:23:22.501 "uuid": "e318809a-e937-4bd6-b0ce-6df115216360", 00:23:22.501 "md_size": 32, 00:23:22.501 "md_interleave": true, 00:23:22.501 "dif_type": 0, 00:23:22.501 "assigned_rate_limits": { 00:23:22.501 "rw_ios_per_sec": 0, 00:23:22.501 "rw_mbytes_per_sec": 0, 00:23:22.501 "r_mbytes_per_sec": 0, 00:23:22.501 "w_mbytes_per_sec": 0 00:23:22.501 }, 00:23:22.501 "claimed": true, 00:23:22.501 "claim_type": "exclusive_write", 00:23:22.501 "zoned": false, 00:23:22.501 "supported_io_types": { 00:23:22.501 "read": true, 00:23:22.501 "write": true, 00:23:22.501 "unmap": true, 00:23:22.501 "flush": true, 00:23:22.501 "reset": true, 00:23:22.501 "nvme_admin": false, 00:23:22.501 "nvme_io": false, 00:23:22.501 "nvme_io_md": false, 00:23:22.501 "write_zeroes": true, 00:23:22.501 "zcopy": true, 00:23:22.501 "get_zone_info": false, 00:23:22.501 "zone_management": false, 00:23:22.501 "zone_append": false, 00:23:22.501 "compare": false, 00:23:22.501 "compare_and_write": false, 00:23:22.501 "abort": true, 00:23:22.501 "seek_hole": false, 00:23:22.501 "seek_data": false, 00:23:22.501 "copy": true, 00:23:22.501 "nvme_iov_md": false 00:23:22.501 }, 00:23:22.501 "memory_domains": [ 00:23:22.501 { 00:23:22.501 "dma_device_id": "system", 00:23:22.501 "dma_device_type": 1 00:23:22.501 }, 00:23:22.501 { 00:23:22.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.501 "dma_device_type": 2 00:23:22.501 } 00:23:22.501 ], 00:23:22.501 "driver_specific": {} 00:23:22.501 }' 00:23:22.501 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:22.501 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:22.501 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:22.501 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:22.501 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:22.758 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:23.016 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:23.016 "name": "BaseBdev2", 00:23:23.016 "aliases": [ 00:23:23.016 "170ed8d6-17bf-4891-ab0d-397e4fe91220" 00:23:23.016 ], 00:23:23.016 "product_name": "Malloc disk", 00:23:23.016 "block_size": 4128, 00:23:23.016 "num_blocks": 8192, 00:23:23.016 "uuid": "170ed8d6-17bf-4891-ab0d-397e4fe91220", 00:23:23.016 "md_size": 32, 00:23:23.016 "md_interleave": true, 00:23:23.016 "dif_type": 0, 00:23:23.016 "assigned_rate_limits": { 00:23:23.016 "rw_ios_per_sec": 0, 00:23:23.016 "rw_mbytes_per_sec": 0, 00:23:23.016 "r_mbytes_per_sec": 0, 00:23:23.016 "w_mbytes_per_sec": 0 00:23:23.016 }, 00:23:23.016 "claimed": true, 00:23:23.016 "claim_type": "exclusive_write", 00:23:23.016 "zoned": false, 00:23:23.016 "supported_io_types": { 00:23:23.016 "read": true, 00:23:23.016 "write": true, 00:23:23.016 "unmap": true, 00:23:23.016 "flush": true, 00:23:23.016 "reset": true, 00:23:23.016 "nvme_admin": false, 00:23:23.016 "nvme_io": false, 00:23:23.016 "nvme_io_md": false, 00:23:23.016 "write_zeroes": true, 00:23:23.016 "zcopy": true, 00:23:23.016 "get_zone_info": false, 00:23:23.016 "zone_management": false, 00:23:23.016 "zone_append": false, 00:23:23.016 "compare": false, 00:23:23.016 "compare_and_write": false, 00:23:23.016 "abort": true, 00:23:23.016 "seek_hole": false, 00:23:23.016 "seek_data": false, 00:23:23.016 "copy": true, 00:23:23.016 "nvme_iov_md": false 00:23:23.016 }, 00:23:23.016 "memory_domains": [ 00:23:23.016 { 00:23:23.016 "dma_device_id": "system", 00:23:23.016 "dma_device_type": 1 00:23:23.016 }, 00:23:23.016 { 00:23:23.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.016 "dma_device_type": 2 00:23:23.016 } 00:23:23.016 ], 00:23:23.016 "driver_specific": {} 00:23:23.016 }' 00:23:23.016 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:23.016 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:23.017 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:23.017 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:23.017 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
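The records above trace the create-and-verify path of the test: the raid1 array is created with an on-disk superblock (-s) while its members are still missing, each base bdev is then added as a 32 MiB malloc disk with 4096-byte blocks and 32 bytes of interleaved metadata (-m 32 -i), and the state is read back through bdev_raid_get_bdevs and jq until it reaches "online". Because the metadata is interleaved, the malloc bdevs report block_size 4128 (4096 data + 32 md), which is what the jq checks above compare against. A condensed sketch of the same RPC calls; the $RPC shorthand is an assumption built from the rpc.py path and socket seen in the log:

  RPC="$SPDK/scripts/rpc.py -s $RPC_SOCK"

  # raid1 with an on-disk superblock; the members need not exist yet, the array
  # stays in "configuring" until both base bdevs appear and are claimed.
  $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

  # 32 MiB malloc bdevs, 4096-byte blocks + 32 bytes of interleaved metadata each.
  $RPC bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
  $RPC bdev_wait_for_examine
  $RPC bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2

  # Read the state back the way the trace does: "configuring" -> "online".
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'
  $RPC bdev_get_bdevs -b BaseBdev1 | jq '.[].block_size'   # 4128 = 4096 data + 32 interleaved md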
00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:23.274 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:23.531 [2024-07-15 13:46:10.945001] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.531 13:46:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:23.531 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.531 "name": "Existed_Raid", 00:23:23.531 "uuid": 
"3a8c4c6b-a10d-4707-ac04-ff01e2faa19c", 00:23:23.531 "strip_size_kb": 0, 00:23:23.531 "state": "online", 00:23:23.531 "raid_level": "raid1", 00:23:23.531 "superblock": true, 00:23:23.531 "num_base_bdevs": 2, 00:23:23.531 "num_base_bdevs_discovered": 1, 00:23:23.531 "num_base_bdevs_operational": 1, 00:23:23.531 "base_bdevs_list": [ 00:23:23.531 { 00:23:23.531 "name": null, 00:23:23.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:23.531 "is_configured": false, 00:23:23.531 "data_offset": 256, 00:23:23.531 "data_size": 7936 00:23:23.531 }, 00:23:23.531 { 00:23:23.531 "name": "BaseBdev2", 00:23:23.531 "uuid": "170ed8d6-17bf-4891-ab0d-397e4fe91220", 00:23:23.531 "is_configured": true, 00:23:23.531 "data_offset": 256, 00:23:23.531 "data_size": 7936 00:23:23.531 } 00:23:23.531 ] 00:23:23.531 }' 00:23:23.531 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.531 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:24.098 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:24.098 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:24.098 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.098 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:24.357 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:24.357 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:24.357 13:46:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:24.616 [2024-07-15 13:46:11.976460] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:24.616 [2024-07-15 13:46:11.976527] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:24.616 [2024-07-15 13:46:11.987086] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:24.616 [2024-07-15 13:46:11.987115] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:24.616 [2024-07-15 13:46:11.987123] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9476c0 name Existed_Raid, state offline 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:24.616 13:46:12 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 99839 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 99839 ']' 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 99839 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 99839 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 99839' 00:23:24.616 killing process with pid 99839 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 99839 00:23:24.616 [2024-07-15 13:46:12.209191] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:24.616 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 99839 00:23:24.616 [2024-07-15 13:46:12.209966] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:24.875 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:23:24.875 00:23:24.875 real 0m8.202s 00:23:24.875 user 0m14.361s 00:23:24.875 sys 0m1.691s 00:23:24.875 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:24.875 13:46:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:24.875 ************************************ 00:23:24.875 END TEST raid_state_function_test_sb_md_interleaved 00:23:24.875 ************************************ 00:23:24.875 13:46:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:24.875 13:46:12 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:23:24.875 13:46:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:23:24.875 13:46:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:24.875 13:46:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:24.875 ************************************ 00:23:24.875 START TEST raid_superblock_test_md_interleaved 00:23:24.875 ************************************ 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 
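Just above, before the raid_superblock_test setup starts, the state-function test tears its fixture down: deleting BaseBdev1 leaves the raid1 array online but degraded (one of two members discovered), deleting BaseBdev2 removes the last member so the raid bdev is cleaned up ("state offline"), and the bdev_svc process (pid 99839) is killed and waited on. A sketch of that teardown, reusing the $RPC and raid_pid names assumed in the earlier sketches:

  $RPC bdev_malloc_delete BaseBdev1    # raid1 keeps Existed_Raid online with one member left
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").num_base_bdevs_discovered'   # expect 1
  $RPC bdev_malloc_delete BaseBdev2    # last member removed -> raid bdev is torn down
  kill "$raid_pid" && wait "$raid_pid" # the script wraps this in its killprocess helper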
00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=101127 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 101127 /var/tmp/spdk-raid.sock 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 101127 ']' 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:24.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:24.875 13:46:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:25.133 [2024-07-15 13:46:12.509436] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
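From here the log moves on to raid_superblock_test_md_interleaved: a fresh bdev_svc instance (pid 101127) is started, and, as the records that follow trace, each base device is now a 32 MiB interleaved-metadata malloc disk wrapped in a passthru bdev with a fixed UUID (malloc1 -> pt1, malloc2 -> pt2) before raid_bdev1 is assembled on the passthru layer, again with a superblock. A condensed sketch of that sequence, using the same $RPC shorthand assumed earlier:

  # Passthru bdevs with fixed UUIDs layered on the malloc disks.
  $RPC bdev_malloc_create 32 4096 -m 32 -i -b malloc1
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $RPC bdev_malloc_create 32 4096 -m 32 -i -b malloc2
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # raid1 with superblock over the passthru bdevs; both members exist, so it comes up "online".
  $RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s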
00:23:25.133 [2024-07-15 13:46:12.509481] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid101127 ] 00:23:25.133 [2024-07-15 13:46:12.592151] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:25.133 [2024-07-15 13:46:12.680110] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.133 [2024-07-15 13:46:12.735391] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:25.133 [2024-07-15 13:46:12.735423] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:26.066 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:26.066 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:23:26.066 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:26.066 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:23:26.067 malloc1 00:23:26.067 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:26.067 [2024-07-15 13:46:13.661003] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:26.067 [2024-07-15 13:46:13.661039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.067 [2024-07-15 13:46:13.661075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x860bb0 00:23:26.067 [2024-07-15 13:46:13.661084] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.067 [2024-07-15 13:46:13.662138] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.067 [2024-07-15 13:46:13.662160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:26.067 pt1 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:23:26.325 malloc2 00:23:26.325 13:46:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:26.583 [2024-07-15 13:46:14.009942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:26.583 [2024-07-15 13:46:14.009978] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.583 [2024-07-15 13:46:14.009991] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9ee350 00:23:26.583 [2024-07-15 13:46:14.010004] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.583 [2024-07-15 13:46:14.010990] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.583 [2024-07-15 13:46:14.011017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:26.583 pt2 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:26.583 [2024-07-15 13:46:14.174404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:26.583 [2024-07-15 13:46:14.175478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:26.583 [2024-07-15 13:46:14.175583] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9efcc0 00:23:26.583 [2024-07-15 13:46:14.175592] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:26.583 [2024-07-15 13:46:14.175641] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x85eda0 00:23:26.583 [2024-07-15 13:46:14.175698] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9efcc0 00:23:26.583 [2024-07-15 13:46:14.175704] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9efcc0 00:23:26.583 [2024-07-15 13:46:14.175742] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.583 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.841 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.841 "name": "raid_bdev1", 00:23:26.841 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:26.841 "strip_size_kb": 0, 00:23:26.841 "state": "online", 00:23:26.841 "raid_level": "raid1", 00:23:26.841 "superblock": true, 00:23:26.841 "num_base_bdevs": 2, 00:23:26.841 "num_base_bdevs_discovered": 2, 00:23:26.841 "num_base_bdevs_operational": 2, 00:23:26.841 "base_bdevs_list": [ 00:23:26.841 { 00:23:26.841 "name": "pt1", 00:23:26.841 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:26.841 "is_configured": true, 00:23:26.841 "data_offset": 256, 00:23:26.841 "data_size": 7936 00:23:26.841 }, 00:23:26.841 { 00:23:26.841 "name": "pt2", 00:23:26.841 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:26.841 "is_configured": true, 00:23:26.841 "data_offset": 256, 00:23:26.841 "data_size": 7936 00:23:26.841 } 00:23:26.841 ] 00:23:26.841 }' 00:23:26.841 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.841 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:27.406 13:46:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:27.406 [2024-07-15 13:46:15.016745] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:27.664 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:27.664 "name": "raid_bdev1", 00:23:27.664 "aliases": [ 00:23:27.664 "bad1fb6a-c641-4be8-8431-afed499f3bb9" 00:23:27.664 ], 00:23:27.664 "product_name": "Raid Volume", 00:23:27.664 "block_size": 4128, 00:23:27.664 "num_blocks": 7936, 00:23:27.664 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:27.664 "md_size": 32, 00:23:27.664 "md_interleave": true, 00:23:27.664 "dif_type": 0, 00:23:27.664 "assigned_rate_limits": { 00:23:27.664 "rw_ios_per_sec": 0, 00:23:27.664 "rw_mbytes_per_sec": 0, 00:23:27.664 "r_mbytes_per_sec": 0, 00:23:27.664 "w_mbytes_per_sec": 0 00:23:27.664 }, 00:23:27.664 "claimed": false, 00:23:27.664 "zoned": false, 00:23:27.664 "supported_io_types": { 00:23:27.664 "read": true, 00:23:27.664 "write": true, 00:23:27.664 "unmap": false, 00:23:27.664 "flush": false, 00:23:27.664 "reset": true, 00:23:27.664 "nvme_admin": false, 00:23:27.664 "nvme_io": false, 00:23:27.664 "nvme_io_md": false, 00:23:27.664 "write_zeroes": true, 00:23:27.664 "zcopy": false, 00:23:27.664 "get_zone_info": false, 00:23:27.664 "zone_management": false, 00:23:27.664 "zone_append": false, 00:23:27.664 "compare": false, 00:23:27.664 "compare_and_write": false, 00:23:27.664 "abort": false, 00:23:27.664 "seek_hole": false, 00:23:27.664 "seek_data": false, 00:23:27.664 "copy": false, 00:23:27.664 "nvme_iov_md": false 00:23:27.664 }, 00:23:27.664 "memory_domains": [ 00:23:27.664 { 00:23:27.664 "dma_device_id": "system", 00:23:27.664 "dma_device_type": 1 00:23:27.664 }, 00:23:27.664 { 00:23:27.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:27.664 "dma_device_type": 2 00:23:27.664 }, 00:23:27.664 { 00:23:27.664 "dma_device_id": "system", 00:23:27.664 "dma_device_type": 1 00:23:27.664 }, 00:23:27.664 { 00:23:27.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:27.664 "dma_device_type": 2 00:23:27.664 } 00:23:27.664 ], 00:23:27.664 "driver_specific": { 00:23:27.664 "raid": { 00:23:27.664 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:27.664 "strip_size_kb": 0, 00:23:27.664 "state": "online", 00:23:27.664 "raid_level": "raid1", 00:23:27.664 "superblock": true, 00:23:27.664 "num_base_bdevs": 2, 00:23:27.664 "num_base_bdevs_discovered": 2, 00:23:27.664 "num_base_bdevs_operational": 2, 00:23:27.664 "base_bdevs_list": [ 00:23:27.664 { 00:23:27.664 "name": "pt1", 00:23:27.664 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:27.664 "is_configured": true, 00:23:27.664 "data_offset": 256, 00:23:27.664 "data_size": 7936 00:23:27.664 }, 00:23:27.664 { 00:23:27.664 "name": "pt2", 00:23:27.664 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:27.664 "is_configured": true, 00:23:27.664 "data_offset": 256, 00:23:27.664 "data_size": 7936 00:23:27.664 } 00:23:27.664 ] 00:23:27.664 } 00:23:27.664 } 00:23:27.664 }' 00:23:27.664 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:27.664 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:27.664 pt2' 00:23:27.664 
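
The assembly sequence recorded above (interleaved-metadata malloc base bdevs, passthru wrappers with fixed UUIDs, a raid1 volume with an on-disk superblock) can be replayed by hand against the same RPC socket. A minimal sketch, reusing the exact commands captured in this run; the only assumption is that the SPDK target process behind /var/tmp/spdk-raid.sock is already up:

#!/usr/bin/env bash
set -e
# RPC client and socket exactly as used by this test run.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# 32 MB malloc bdev, 4096-byte blocks, 32 bytes of interleaved metadata (flags as recorded above).
$rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc2
# Passthru bdev on top, with a fixed UUID so the raid superblock content is deterministic.
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
# raid1 volume over both passthru bdevs; -s asks for an on-disk superblock.
$rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
# The array should come up online with both base bdevs discovered.
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
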
13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:27.664 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:27.664 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:27.664 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:27.664 "name": "pt1", 00:23:27.664 "aliases": [ 00:23:27.664 "00000000-0000-0000-0000-000000000001" 00:23:27.664 ], 00:23:27.664 "product_name": "passthru", 00:23:27.664 "block_size": 4128, 00:23:27.664 "num_blocks": 8192, 00:23:27.664 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:27.664 "md_size": 32, 00:23:27.664 "md_interleave": true, 00:23:27.664 "dif_type": 0, 00:23:27.664 "assigned_rate_limits": { 00:23:27.664 "rw_ios_per_sec": 0, 00:23:27.664 "rw_mbytes_per_sec": 0, 00:23:27.664 "r_mbytes_per_sec": 0, 00:23:27.664 "w_mbytes_per_sec": 0 00:23:27.664 }, 00:23:27.664 "claimed": true, 00:23:27.664 "claim_type": "exclusive_write", 00:23:27.664 "zoned": false, 00:23:27.664 "supported_io_types": { 00:23:27.664 "read": true, 00:23:27.664 "write": true, 00:23:27.664 "unmap": true, 00:23:27.664 "flush": true, 00:23:27.664 "reset": true, 00:23:27.664 "nvme_admin": false, 00:23:27.664 "nvme_io": false, 00:23:27.664 "nvme_io_md": false, 00:23:27.664 "write_zeroes": true, 00:23:27.664 "zcopy": true, 00:23:27.664 "get_zone_info": false, 00:23:27.664 "zone_management": false, 00:23:27.664 "zone_append": false, 00:23:27.664 "compare": false, 00:23:27.664 "compare_and_write": false, 00:23:27.664 "abort": true, 00:23:27.664 "seek_hole": false, 00:23:27.664 "seek_data": false, 00:23:27.664 "copy": true, 00:23:27.664 "nvme_iov_md": false 00:23:27.664 }, 00:23:27.664 "memory_domains": [ 00:23:27.664 { 00:23:27.664 "dma_device_id": "system", 00:23:27.664 "dma_device_type": 1 00:23:27.664 }, 00:23:27.664 { 00:23:27.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:27.664 "dma_device_type": 2 00:23:27.664 } 00:23:27.665 ], 00:23:27.665 "driver_specific": { 00:23:27.665 "passthru": { 00:23:27.665 "name": "pt1", 00:23:27.665 "base_bdev_name": "malloc1" 00:23:27.665 } 00:23:27.665 } 00:23:27.665 }' 00:23:27.665 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:27.922 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:27.922 13:46:15 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:28.179 "name": "pt2", 00:23:28.179 "aliases": [ 00:23:28.179 "00000000-0000-0000-0000-000000000002" 00:23:28.179 ], 00:23:28.179 "product_name": "passthru", 00:23:28.179 "block_size": 4128, 00:23:28.179 "num_blocks": 8192, 00:23:28.179 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:28.179 "md_size": 32, 00:23:28.179 "md_interleave": true, 00:23:28.179 "dif_type": 0, 00:23:28.179 "assigned_rate_limits": { 00:23:28.179 "rw_ios_per_sec": 0, 00:23:28.179 "rw_mbytes_per_sec": 0, 00:23:28.179 "r_mbytes_per_sec": 0, 00:23:28.179 "w_mbytes_per_sec": 0 00:23:28.179 }, 00:23:28.179 "claimed": true, 00:23:28.179 "claim_type": "exclusive_write", 00:23:28.179 "zoned": false, 00:23:28.179 "supported_io_types": { 00:23:28.179 "read": true, 00:23:28.179 "write": true, 00:23:28.179 "unmap": true, 00:23:28.179 "flush": true, 00:23:28.179 "reset": true, 00:23:28.179 "nvme_admin": false, 00:23:28.179 "nvme_io": false, 00:23:28.179 "nvme_io_md": false, 00:23:28.179 "write_zeroes": true, 00:23:28.179 "zcopy": true, 00:23:28.179 "get_zone_info": false, 00:23:28.179 "zone_management": false, 00:23:28.179 "zone_append": false, 00:23:28.179 "compare": false, 00:23:28.179 "compare_and_write": false, 00:23:28.179 "abort": true, 00:23:28.179 "seek_hole": false, 00:23:28.179 "seek_data": false, 00:23:28.179 "copy": true, 00:23:28.179 "nvme_iov_md": false 00:23:28.179 }, 00:23:28.179 "memory_domains": [ 00:23:28.179 { 00:23:28.179 "dma_device_id": "system", 00:23:28.179 "dma_device_type": 1 00:23:28.179 }, 00:23:28.179 { 00:23:28.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:28.179 "dma_device_type": 2 00:23:28.179 } 00:23:28.179 ], 00:23:28.179 "driver_specific": { 00:23:28.179 "passthru": { 00:23:28.179 "name": "pt2", 00:23:28.179 "base_bdev_name": "malloc2" 00:23:28.179 } 00:23:28.179 } 00:23:28.179 }' 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:28.179 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:28.437 13:46:15 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:28.437 13:46:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:28.437 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:28.437 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:28.437 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:28.437 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:28.695 [2024-07-15 13:46:16.195807] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:28.695 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bad1fb6a-c641-4be8-8431-afed499f3bb9 00:23:28.695 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z bad1fb6a-c641-4be8-8431-afed499f3bb9 ']' 00:23:28.695 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:28.953 [2024-07-15 13:46:16.364059] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:28.953 [2024-07-15 13:46:16.364075] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:28.953 [2024-07-15 13:46:16.364121] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:28.953 [2024-07-15 13:46:16.364156] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:28.953 [2024-07-15 13:46:16.364164] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9efcc0 name raid_bdev1, state offline 00:23:28.953 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.953 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:28.953 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:28.953 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:28.953 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:28.953 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:29.213 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:29.213 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:29.472 13:46:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:29.472 13:46:16 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:29.472 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:29.730 [2024-07-15 13:46:17.214232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:29.730 [2024-07-15 13:46:17.215241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:29.730 [2024-07-15 13:46:17.215284] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:29.730 [2024-07-15 13:46:17.215315] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:29.730 [2024-07-15 13:46:17.215346] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:29.730 [2024-07-15 13:46:17.215354] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9ed200 name raid_bdev1, state configuring 00:23:29.730 request: 00:23:29.730 { 00:23:29.730 "name": "raid_bdev1", 00:23:29.730 "raid_level": "raid1", 00:23:29.730 "base_bdevs": [ 00:23:29.730 "malloc1", 00:23:29.730 "malloc2" 00:23:29.730 ], 00:23:29.730 "superblock": false, 00:23:29.730 "method": 
"bdev_raid_create", 00:23:29.730 "req_id": 1 00:23:29.730 } 00:23:29.730 Got JSON-RPC error response 00:23:29.730 response: 00:23:29.730 { 00:23:29.730 "code": -17, 00:23:29.730 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:29.730 } 00:23:29.730 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:23:29.730 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:29.730 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:29.730 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:29.730 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.730 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:29.988 [2024-07-15 13:46:17.559091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:29.988 [2024-07-15 13:46:17.559143] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.988 [2024-07-15 13:46:17.559174] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x85eba0 00:23:29.988 [2024-07-15 13:46:17.559183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.988 [2024-07-15 13:46:17.560232] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.988 [2024-07-15 13:46:17.560269] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:29.988 [2024-07-15 13:46:17.560306] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:29.988 [2024-07-15 13:46:17.560327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:29.988 pt1 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:29.988 
13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.988 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.245 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.245 "name": "raid_bdev1", 00:23:30.245 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:30.245 "strip_size_kb": 0, 00:23:30.245 "state": "configuring", 00:23:30.245 "raid_level": "raid1", 00:23:30.245 "superblock": true, 00:23:30.245 "num_base_bdevs": 2, 00:23:30.245 "num_base_bdevs_discovered": 1, 00:23:30.245 "num_base_bdevs_operational": 2, 00:23:30.245 "base_bdevs_list": [ 00:23:30.245 { 00:23:30.245 "name": "pt1", 00:23:30.245 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:30.245 "is_configured": true, 00:23:30.245 "data_offset": 256, 00:23:30.245 "data_size": 7936 00:23:30.245 }, 00:23:30.245 { 00:23:30.245 "name": null, 00:23:30.245 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:30.245 "is_configured": false, 00:23:30.245 "data_offset": 256, 00:23:30.245 "data_size": 7936 00:23:30.245 } 00:23:30.245 ] 00:23:30.245 }' 00:23:30.245 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.245 13:46:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:30.810 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:23:30.810 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:30.810 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:30.810 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:30.810 [2024-07-15 13:46:18.377187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:30.810 [2024-07-15 13:46:18.377226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.810 [2024-07-15 13:46:18.377240] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f2fd0 00:23:30.811 [2024-07-15 13:46:18.377249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.811 [2024-07-15 13:46:18.377385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.811 [2024-07-15 13:46:18.377396] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:30.811 [2024-07-15 13:46:18.377429] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:30.811 [2024-07-15 13:46:18.377443] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:30.811 [2024-07-15 13:46:18.377508] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9f16e0 00:23:30.811 [2024-07-15 13:46:18.377515] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:30.811 [2024-07-15 13:46:18.377553] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9f26e0 00:23:30.811 [2024-07-15 13:46:18.377605] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9f16e0 00:23:30.811 [2024-07-15 13:46:18.377611] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9f16e0 00:23:30.811 [2024-07-15 13:46:18.377649] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.811 pt2 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.811 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.072 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.072 "name": "raid_bdev1", 00:23:31.072 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:31.072 "strip_size_kb": 0, 00:23:31.072 "state": "online", 00:23:31.072 "raid_level": "raid1", 00:23:31.072 "superblock": true, 00:23:31.072 "num_base_bdevs": 2, 00:23:31.072 "num_base_bdevs_discovered": 2, 00:23:31.072 "num_base_bdevs_operational": 2, 00:23:31.072 "base_bdevs_list": [ 00:23:31.072 { 00:23:31.072 "name": "pt1", 00:23:31.072 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:31.072 "is_configured": true, 00:23:31.072 "data_offset": 256, 00:23:31.072 "data_size": 7936 00:23:31.072 }, 00:23:31.072 { 00:23:31.072 "name": "pt2", 00:23:31.072 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:31.072 "is_configured": true, 00:23:31.072 "data_offset": 256, 00:23:31.072 "data_size": 7936 00:23:31.072 } 00:23:31.072 ] 00:23:31.072 }' 00:23:31.072 13:46:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.072 13:46:18 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:31.636 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:31.636 [2024-07-15 13:46:19.239583] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:31.893 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:31.893 "name": "raid_bdev1", 00:23:31.893 "aliases": [ 00:23:31.893 "bad1fb6a-c641-4be8-8431-afed499f3bb9" 00:23:31.893 ], 00:23:31.893 "product_name": "Raid Volume", 00:23:31.893 "block_size": 4128, 00:23:31.893 "num_blocks": 7936, 00:23:31.893 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:31.893 "md_size": 32, 00:23:31.893 "md_interleave": true, 00:23:31.893 "dif_type": 0, 00:23:31.893 "assigned_rate_limits": { 00:23:31.893 "rw_ios_per_sec": 0, 00:23:31.893 "rw_mbytes_per_sec": 0, 00:23:31.893 "r_mbytes_per_sec": 0, 00:23:31.893 "w_mbytes_per_sec": 0 00:23:31.893 }, 00:23:31.893 "claimed": false, 00:23:31.893 "zoned": false, 00:23:31.893 "supported_io_types": { 00:23:31.893 "read": true, 00:23:31.893 "write": true, 00:23:31.893 "unmap": false, 00:23:31.893 "flush": false, 00:23:31.893 "reset": true, 00:23:31.893 "nvme_admin": false, 00:23:31.893 "nvme_io": false, 00:23:31.893 "nvme_io_md": false, 00:23:31.893 "write_zeroes": true, 00:23:31.893 "zcopy": false, 00:23:31.893 "get_zone_info": false, 00:23:31.893 "zone_management": false, 00:23:31.893 "zone_append": false, 00:23:31.893 "compare": false, 00:23:31.893 "compare_and_write": false, 00:23:31.893 "abort": false, 00:23:31.893 "seek_hole": false, 00:23:31.893 "seek_data": false, 00:23:31.893 "copy": false, 00:23:31.893 "nvme_iov_md": false 00:23:31.893 }, 00:23:31.893 "memory_domains": [ 00:23:31.893 { 00:23:31.893 "dma_device_id": "system", 00:23:31.893 "dma_device_type": 1 00:23:31.893 }, 00:23:31.893 { 00:23:31.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.893 "dma_device_type": 2 00:23:31.893 }, 00:23:31.893 { 00:23:31.893 "dma_device_id": "system", 00:23:31.893 "dma_device_type": 1 00:23:31.893 }, 00:23:31.893 { 00:23:31.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.893 "dma_device_type": 2 00:23:31.893 } 00:23:31.893 ], 00:23:31.893 "driver_specific": { 00:23:31.893 "raid": { 00:23:31.893 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:31.893 "strip_size_kb": 0, 00:23:31.893 "state": "online", 00:23:31.893 "raid_level": "raid1", 00:23:31.893 "superblock": true, 00:23:31.893 "num_base_bdevs": 2, 00:23:31.893 
"num_base_bdevs_discovered": 2, 00:23:31.893 "num_base_bdevs_operational": 2, 00:23:31.893 "base_bdevs_list": [ 00:23:31.893 { 00:23:31.893 "name": "pt1", 00:23:31.893 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:31.893 "is_configured": true, 00:23:31.893 "data_offset": 256, 00:23:31.893 "data_size": 7936 00:23:31.893 }, 00:23:31.893 { 00:23:31.893 "name": "pt2", 00:23:31.893 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:31.894 "is_configured": true, 00:23:31.894 "data_offset": 256, 00:23:31.894 "data_size": 7936 00:23:31.894 } 00:23:31.894 ] 00:23:31.894 } 00:23:31.894 } 00:23:31.894 }' 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:31.894 pt2' 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:31.894 "name": "pt1", 00:23:31.894 "aliases": [ 00:23:31.894 "00000000-0000-0000-0000-000000000001" 00:23:31.894 ], 00:23:31.894 "product_name": "passthru", 00:23:31.894 "block_size": 4128, 00:23:31.894 "num_blocks": 8192, 00:23:31.894 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:31.894 "md_size": 32, 00:23:31.894 "md_interleave": true, 00:23:31.894 "dif_type": 0, 00:23:31.894 "assigned_rate_limits": { 00:23:31.894 "rw_ios_per_sec": 0, 00:23:31.894 "rw_mbytes_per_sec": 0, 00:23:31.894 "r_mbytes_per_sec": 0, 00:23:31.894 "w_mbytes_per_sec": 0 00:23:31.894 }, 00:23:31.894 "claimed": true, 00:23:31.894 "claim_type": "exclusive_write", 00:23:31.894 "zoned": false, 00:23:31.894 "supported_io_types": { 00:23:31.894 "read": true, 00:23:31.894 "write": true, 00:23:31.894 "unmap": true, 00:23:31.894 "flush": true, 00:23:31.894 "reset": true, 00:23:31.894 "nvme_admin": false, 00:23:31.894 "nvme_io": false, 00:23:31.894 "nvme_io_md": false, 00:23:31.894 "write_zeroes": true, 00:23:31.894 "zcopy": true, 00:23:31.894 "get_zone_info": false, 00:23:31.894 "zone_management": false, 00:23:31.894 "zone_append": false, 00:23:31.894 "compare": false, 00:23:31.894 "compare_and_write": false, 00:23:31.894 "abort": true, 00:23:31.894 "seek_hole": false, 00:23:31.894 "seek_data": false, 00:23:31.894 "copy": true, 00:23:31.894 "nvme_iov_md": false 00:23:31.894 }, 00:23:31.894 "memory_domains": [ 00:23:31.894 { 00:23:31.894 "dma_device_id": "system", 00:23:31.894 "dma_device_type": 1 00:23:31.894 }, 00:23:31.894 { 00:23:31.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.894 "dma_device_type": 2 00:23:31.894 } 00:23:31.894 ], 00:23:31.894 "driver_specific": { 00:23:31.894 "passthru": { 00:23:31.894 "name": "pt1", 00:23:31.894 "base_bdev_name": "malloc1" 00:23:31.894 } 00:23:31.894 } 00:23:31.894 }' 00:23:31.894 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.151 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:23:32.151 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.152 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.410 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:32.410 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:32.410 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:32.410 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:32.410 13:46:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:32.410 "name": "pt2", 00:23:32.410 "aliases": [ 00:23:32.410 "00000000-0000-0000-0000-000000000002" 00:23:32.410 ], 00:23:32.410 "product_name": "passthru", 00:23:32.410 "block_size": 4128, 00:23:32.410 "num_blocks": 8192, 00:23:32.410 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:32.410 "md_size": 32, 00:23:32.410 "md_interleave": true, 00:23:32.410 "dif_type": 0, 00:23:32.410 "assigned_rate_limits": { 00:23:32.410 "rw_ios_per_sec": 0, 00:23:32.410 "rw_mbytes_per_sec": 0, 00:23:32.410 "r_mbytes_per_sec": 0, 00:23:32.410 "w_mbytes_per_sec": 0 00:23:32.410 }, 00:23:32.410 "claimed": true, 00:23:32.410 "claim_type": "exclusive_write", 00:23:32.410 "zoned": false, 00:23:32.410 "supported_io_types": { 00:23:32.410 "read": true, 00:23:32.410 "write": true, 00:23:32.410 "unmap": true, 00:23:32.410 "flush": true, 00:23:32.410 "reset": true, 00:23:32.410 "nvme_admin": false, 00:23:32.410 "nvme_io": false, 00:23:32.410 "nvme_io_md": false, 00:23:32.410 "write_zeroes": true, 00:23:32.410 "zcopy": true, 00:23:32.410 "get_zone_info": false, 00:23:32.410 "zone_management": false, 00:23:32.410 "zone_append": false, 00:23:32.410 "compare": false, 00:23:32.410 "compare_and_write": false, 00:23:32.410 "abort": true, 00:23:32.410 "seek_hole": false, 00:23:32.410 "seek_data": false, 00:23:32.410 "copy": true, 00:23:32.410 "nvme_iov_md": false 00:23:32.410 }, 00:23:32.410 "memory_domains": [ 00:23:32.410 { 00:23:32.410 "dma_device_id": "system", 00:23:32.410 "dma_device_type": 1 00:23:32.410 }, 00:23:32.410 { 00:23:32.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:32.410 "dma_device_type": 2 00:23:32.410 } 00:23:32.410 ], 00:23:32.410 "driver_specific": { 00:23:32.410 "passthru": { 00:23:32.410 "name": "pt2", 00:23:32.410 "base_bdev_name": "malloc2" 00:23:32.410 } 00:23:32.410 } 00:23:32.410 }' 00:23:32.410 13:46:19 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.668 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:32.928 [2024-07-15 13:46:20.446787] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' bad1fb6a-c641-4be8-8431-afed499f3bb9 '!=' bad1fb6a-c641-4be8-8431-afed499f3bb9 ']' 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:23:32.928 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:33.186 [2024-07-15 13:46:20.623094] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:33.186 13:46:20 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.186 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.445 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.445 "name": "raid_bdev1", 00:23:33.445 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:33.445 "strip_size_kb": 0, 00:23:33.445 "state": "online", 00:23:33.445 "raid_level": "raid1", 00:23:33.445 "superblock": true, 00:23:33.445 "num_base_bdevs": 2, 00:23:33.445 "num_base_bdevs_discovered": 1, 00:23:33.445 "num_base_bdevs_operational": 1, 00:23:33.445 "base_bdevs_list": [ 00:23:33.445 { 00:23:33.445 "name": null, 00:23:33.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.445 "is_configured": false, 00:23:33.445 "data_offset": 256, 00:23:33.445 "data_size": 7936 00:23:33.445 }, 00:23:33.445 { 00:23:33.445 "name": "pt2", 00:23:33.445 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:33.445 "is_configured": true, 00:23:33.445 "data_offset": 256, 00:23:33.445 "data_size": 7936 00:23:33.445 } 00:23:33.445 ] 00:23:33.445 }' 00:23:33.445 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.445 13:46:20 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:34.013 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:34.013 [2024-07-15 13:46:21.493312] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:34.013 [2024-07-15 13:46:21.493336] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:34.013 [2024-07-15 13:46:21.493374] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:34.013 [2024-07-15 13:46:21.493404] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:34.014 [2024-07-15 13:46:21.493412] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f16e0 name raid_bdev1, state offline 00:23:34.014 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.014 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
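
The sequence just recorded removes one passthru base bdev and confirms the raid1 volume stays online in a degraded state (one base bdev discovered and operational) before the array is deleted. Roughly, under the same assumptions as the sketches above:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Removing one member: the raid1 volume should stay online, now degraded.
$rpc bdev_passthru_delete pt1
info=$($rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r .state <<<"$info") == online ]]
[[ $(jq .num_base_bdevs_discovered <<<"$info") == 1 ]]

# Tear down: delete the raid bdev, then the remaining base bdev.
$rpc bdev_raid_delete raid_bdev1
$rpc bdev_passthru_delete pt2
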
00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:23:34.273 13:46:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:34.532 [2024-07-15 13:46:22.002619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:34.532 [2024-07-15 13:46:22.002655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.532 [2024-07-15 13:46:22.002683] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f3380 00:23:34.532 [2024-07-15 13:46:22.002692] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.532 [2024-07-15 13:46:22.003745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.532 [2024-07-15 13:46:22.003766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:34.532 [2024-07-15 13:46:22.003800] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:34.532 [2024-07-15 13:46:22.003821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:34.532 [2024-07-15 13:46:22.003875] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9f1e00 00:23:34.532 [2024-07-15 13:46:22.003882] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:34.532 [2024-07-15 13:46:22.003924] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9eeef0 00:23:34.532 [2024-07-15 13:46:22.003976] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9f1e00 00:23:34.532 [2024-07-15 13:46:22.003983] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9f1e00 00:23:34.532 [2024-07-15 13:46:22.004031] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:34.532 pt2 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.532 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.791 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.791 "name": "raid_bdev1", 00:23:34.791 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:34.791 "strip_size_kb": 0, 00:23:34.791 "state": "online", 00:23:34.791 "raid_level": "raid1", 00:23:34.791 "superblock": true, 00:23:34.791 "num_base_bdevs": 2, 00:23:34.791 "num_base_bdevs_discovered": 1, 00:23:34.791 "num_base_bdevs_operational": 1, 00:23:34.791 "base_bdevs_list": [ 00:23:34.791 { 00:23:34.791 "name": null, 00:23:34.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.791 "is_configured": false, 00:23:34.791 "data_offset": 256, 00:23:34.791 "data_size": 7936 00:23:34.791 }, 00:23:34.791 { 00:23:34.791 "name": "pt2", 00:23:34.791 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:34.791 "is_configured": true, 00:23:34.791 "data_offset": 256, 00:23:34.791 "data_size": 7936 00:23:34.791 } 00:23:34.791 ] 00:23:34.791 }' 00:23:34.791 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.791 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:35.359 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:35.359 [2024-07-15 13:46:22.856821] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:35.359 [2024-07-15 13:46:22.856842] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:35.359 [2024-07-15 13:46:22.856881] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:35.359 [2024-07-15 13:46:22.856910] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:35.359 [2024-07-15 13:46:22.856918] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f1e00 name raid_bdev1, state offline 00:23:35.359 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.359 13:46:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:35.618 [2024-07-15 13:46:23.209712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:35.618 [2024-07-15 13:46:23.209753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.618 [2024-07-15 13:46:23.209781] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9eed70 00:23:35.618 [2024-07-15 13:46:23.209790] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.618 [2024-07-15 13:46:23.210831] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.618 [2024-07-15 13:46:23.210853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:35.618 [2024-07-15 13:46:23.210888] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:35.618 [2024-07-15 13:46:23.210909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:35.618 [2024-07-15 13:46:23.210968] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:35.618 [2024-07-15 13:46:23.210977] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:35.618 [2024-07-15 13:46:23.210988] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f2a10 name raid_bdev1, state configuring 00:23:35.618 [2024-07-15 13:46:23.211017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:35.618 [2024-07-15 13:46:23.211058] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x860de0 00:23:35.618 [2024-07-15 13:46:23.211066] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:35.618 [2024-07-15 13:46:23.211105] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9f2b90 00:23:35.618 [2024-07-15 13:46:23.211154] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x860de0 00:23:35.618 [2024-07-15 13:46:23.211161] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x860de0 00:23:35.618 [2024-07-15 13:46:23.211201] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.618 pt1 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:35.618 13:46:23 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:35.618 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.877 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.877 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:35.877 "name": "raid_bdev1", 00:23:35.877 "uuid": "bad1fb6a-c641-4be8-8431-afed499f3bb9", 00:23:35.877 "strip_size_kb": 0, 00:23:35.877 "state": "online", 00:23:35.877 "raid_level": "raid1", 00:23:35.877 "superblock": true, 00:23:35.877 "num_base_bdevs": 2, 00:23:35.877 "num_base_bdevs_discovered": 1, 00:23:35.877 "num_base_bdevs_operational": 1, 00:23:35.877 "base_bdevs_list": [ 00:23:35.877 { 00:23:35.877 "name": null, 00:23:35.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.877 "is_configured": false, 00:23:35.877 "data_offset": 256, 00:23:35.877 "data_size": 7936 00:23:35.877 }, 00:23:35.877 { 00:23:35.877 "name": "pt2", 00:23:35.877 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:35.877 "is_configured": true, 00:23:35.877 "data_offset": 256, 00:23:35.877 "data_size": 7936 00:23:35.877 } 00:23:35.877 ] 00:23:35.877 }' 00:23:35.877 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:35.877 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:36.446 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:36.446 13:46:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:36.705 [2024-07-15 13:46:24.236519] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' bad1fb6a-c641-4be8-8431-afed499f3bb9 '!=' bad1fb6a-c641-4be8-8431-afed499f3bb9 ']' 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 101127 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 101127 ']' 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 101127 00:23:36.705 13:46:24 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 101127 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 101127' 00:23:36.705 killing process with pid 101127 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 101127 00:23:36.705 [2024-07-15 13:46:24.284347] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:36.705 [2024-07-15 13:46:24.284393] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:36.705 [2024-07-15 13:46:24.284425] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:36.705 [2024-07-15 13:46:24.284434] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x860de0 name raid_bdev1, state offline 00:23:36.705 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 101127 00:23:36.705 [2024-07-15 13:46:24.302831] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:36.964 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:23:36.964 00:23:36.964 real 0m12.035s 00:23:36.964 user 0m21.635s 00:23:36.964 sys 0m2.356s 00:23:36.964 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:36.964 13:46:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:36.964 ************************************ 00:23:36.964 END TEST raid_superblock_test_md_interleaved 00:23:36.964 ************************************ 00:23:36.964 13:46:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:36.964 13:46:24 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:23:36.964 13:46:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:36.964 13:46:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:36.964 13:46:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:36.964 ************************************ 00:23:36.964 START TEST raid_rebuild_test_sb_md_interleaved 00:23:36.964 ************************************ 00:23:36.964 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:37.223 
13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=103021 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 103021 /var/tmp/spdk-raid.sock 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 103021 ']' 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:23:37.223 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:37.223 13:46:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:37.223 [2024-07-15 13:46:24.641907] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:23:37.223 [2024-07-15 13:46:24.641959] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid103021 ] 00:23:37.223 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:37.223 Zero copy mechanism will not be used. 00:23:37.223 [2024-07-15 13:46:24.726199] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:37.223 [2024-07-15 13:46:24.814759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:37.481 [2024-07-15 13:46:24.868727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:37.481 [2024-07-15 13:46:24.868754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:38.070 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:38.070 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:23:38.070 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:38.070 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:23:38.070 BaseBdev1_malloc 00:23:38.070 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:38.328 [2024-07-15 13:46:25.783708] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:38.328 [2024-07-15 13:46:25.783746] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.328 [2024-07-15 13:46:25.783764] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20565a0 00:23:38.328 [2024-07-15 13:46:25.783778] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.328 [2024-07-15 13:46:25.784934] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.328 [2024-07-15 13:46:25.784956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:38.328 BaseBdev1 00:23:38.328 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:38.328 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:23:38.587 BaseBdev2_malloc 00:23:38.587 13:46:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:38.587 [2024-07-15 13:46:26.134112] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:38.587 [2024-07-15 13:46:26.134148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.587 [2024-07-15 13:46:26.134181] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x204db90 00:23:38.587 [2024-07-15 13:46:26.134190] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.587 [2024-07-15 13:46:26.135537] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.587 [2024-07-15 13:46:26.135560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:38.587 BaseBdev2 00:23:38.587 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:23:38.844 spare_malloc 00:23:38.844 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:39.102 spare_delay 00:23:39.102 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:39.102 [2024-07-15 13:46:26.656654] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:39.102 [2024-07-15 13:46:26.656696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.102 [2024-07-15 13:46:26.656715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2050930 00:23:39.102 [2024-07-15 13:46:26.656724] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.102 [2024-07-15 13:46:26.657826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.102 [2024-07-15 13:46:26.657848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:39.102 spare 00:23:39.102 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:39.360 [2024-07-15 13:46:26.817106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:39.360 [2024-07-15 13:46:26.818158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:39.360 [2024-07-15 13:46:26.818290] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2052c30 00:23:39.360 [2024-07-15 13:46:26.818299] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:39.360 [2024-07-15 13:46:26.818355] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eb9280 00:23:39.360 [2024-07-15 13:46:26.818416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2052c30 00:23:39.360 [2024-07-15 13:46:26.818423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2052c30 00:23:39.360 [2024-07-15 13:46:26.818469] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.360 13:46:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.619 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.619 "name": "raid_bdev1", 00:23:39.619 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:39.619 "strip_size_kb": 0, 00:23:39.619 "state": "online", 00:23:39.619 "raid_level": "raid1", 00:23:39.619 "superblock": true, 00:23:39.619 "num_base_bdevs": 2, 00:23:39.619 "num_base_bdevs_discovered": 2, 00:23:39.619 "num_base_bdevs_operational": 2, 00:23:39.619 "base_bdevs_list": [ 00:23:39.619 { 00:23:39.619 "name": "BaseBdev1", 00:23:39.619 "uuid": "4c59d81f-a093-542c-81cf-c6631ae08195", 00:23:39.619 "is_configured": true, 00:23:39.619 "data_offset": 256, 00:23:39.619 "data_size": 7936 00:23:39.619 }, 00:23:39.619 { 00:23:39.619 "name": "BaseBdev2", 00:23:39.619 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:39.619 "is_configured": true, 00:23:39.619 "data_offset": 256, 00:23:39.619 "data_size": 7936 00:23:39.619 } 00:23:39.619 ] 00:23:39.619 }' 00:23:39.619 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.619 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:39.940 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:39.940 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:40.198 [2024-07-15 13:46:27.655392] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:40.198 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:23:40.198 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.198 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:40.455 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:23:40.455 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:40.455 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:23:40.455 13:46:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:40.455 [2024-07-15 13:46:28.004124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:40.455 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:40.455 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.455 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.455 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.455 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.456 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.713 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.713 "name": "raid_bdev1", 00:23:40.713 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:40.713 "strip_size_kb": 0, 00:23:40.713 "state": "online", 00:23:40.713 "raid_level": "raid1", 00:23:40.713 "superblock": true, 00:23:40.713 "num_base_bdevs": 2, 00:23:40.713 "num_base_bdevs_discovered": 1, 00:23:40.713 "num_base_bdevs_operational": 1, 00:23:40.713 "base_bdevs_list": [ 00:23:40.713 { 00:23:40.713 "name": null, 00:23:40.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.713 "is_configured": false, 00:23:40.713 "data_offset": 256, 00:23:40.713 "data_size": 7936 00:23:40.713 }, 00:23:40.713 { 00:23:40.713 "name": "BaseBdev2", 00:23:40.713 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:40.713 "is_configured": true, 00:23:40.713 "data_offset": 256, 00:23:40.713 "data_size": 7936 00:23:40.713 } 00:23:40.713 ] 00:23:40.713 }' 00:23:40.713 13:46:28 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.713 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:41.278 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:41.278 [2024-07-15 13:46:28.802205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:41.278 [2024-07-15 13:46:28.805409] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2052b10 00:23:41.278 [2024-07-15 13:46:28.807053] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:41.278 13:46:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:42.211 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:42.211 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:42.211 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:42.211 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:42.211 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:42.212 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.212 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.469 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:42.469 "name": "raid_bdev1", 00:23:42.469 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:42.469 "strip_size_kb": 0, 00:23:42.469 "state": "online", 00:23:42.469 "raid_level": "raid1", 00:23:42.469 "superblock": true, 00:23:42.469 "num_base_bdevs": 2, 00:23:42.469 "num_base_bdevs_discovered": 2, 00:23:42.469 "num_base_bdevs_operational": 2, 00:23:42.469 "process": { 00:23:42.469 "type": "rebuild", 00:23:42.469 "target": "spare", 00:23:42.469 "progress": { 00:23:42.469 "blocks": 2816, 00:23:42.469 "percent": 35 00:23:42.469 } 00:23:42.469 }, 00:23:42.469 "base_bdevs_list": [ 00:23:42.469 { 00:23:42.469 "name": "spare", 00:23:42.469 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:42.469 "is_configured": true, 00:23:42.469 "data_offset": 256, 00:23:42.469 "data_size": 7936 00:23:42.469 }, 00:23:42.469 { 00:23:42.469 "name": "BaseBdev2", 00:23:42.469 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:42.469 "is_configured": true, 00:23:42.469 "data_offset": 256, 00:23:42.469 "data_size": 7936 00:23:42.469 } 00:23:42.469 ] 00:23:42.469 }' 00:23:42.469 13:46:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:42.469 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:42.469 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:42.469 13:46:30 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:42.469 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:42.727 [2024-07-15 13:46:30.207114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:42.727 [2024-07-15 13:46:30.217513] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:42.727 [2024-07-15 13:46:30.217543] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.727 [2024-07-15 13:46:30.217569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:42.727 [2024-07-15 13:46:30.217575] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.727 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.985 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:42.985 "name": "raid_bdev1", 00:23:42.985 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:42.985 "strip_size_kb": 0, 00:23:42.985 "state": "online", 00:23:42.985 "raid_level": "raid1", 00:23:42.985 "superblock": true, 00:23:42.985 "num_base_bdevs": 2, 00:23:42.985 "num_base_bdevs_discovered": 1, 00:23:42.985 "num_base_bdevs_operational": 1, 00:23:42.985 "base_bdevs_list": [ 00:23:42.985 { 00:23:42.985 "name": null, 00:23:42.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:42.985 "is_configured": false, 00:23:42.985 "data_offset": 256, 00:23:42.985 "data_size": 7936 00:23:42.985 }, 00:23:42.985 { 00:23:42.985 "name": "BaseBdev2", 00:23:42.985 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:42.985 "is_configured": true, 00:23:42.985 "data_offset": 256, 00:23:42.985 "data_size": 7936 00:23:42.985 } 00:23:42.985 ] 00:23:42.985 }' 00:23:42.985 
13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:42.985 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.553 13:46:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.553 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:43.553 "name": "raid_bdev1", 00:23:43.553 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:43.553 "strip_size_kb": 0, 00:23:43.553 "state": "online", 00:23:43.553 "raid_level": "raid1", 00:23:43.553 "superblock": true, 00:23:43.553 "num_base_bdevs": 2, 00:23:43.553 "num_base_bdevs_discovered": 1, 00:23:43.553 "num_base_bdevs_operational": 1, 00:23:43.553 "base_bdevs_list": [ 00:23:43.553 { 00:23:43.553 "name": null, 00:23:43.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.553 "is_configured": false, 00:23:43.553 "data_offset": 256, 00:23:43.553 "data_size": 7936 00:23:43.553 }, 00:23:43.553 { 00:23:43.553 "name": "BaseBdev2", 00:23:43.553 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:43.553 "is_configured": true, 00:23:43.553 "data_offset": 256, 00:23:43.553 "data_size": 7936 00:23:43.553 } 00:23:43.553 ] 00:23:43.553 }' 00:23:43.553 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:43.553 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:43.553 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:43.553 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:43.553 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:43.812 [2024-07-15 13:46:31.276421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:43.812 [2024-07-15 13:46:31.280094] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x204eb30 00:23:43.812 [2024-07-15 13:46:31.281154] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:43.812 13:46:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.829 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:45.088 "name": "raid_bdev1", 00:23:45.088 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:45.088 "strip_size_kb": 0, 00:23:45.088 "state": "online", 00:23:45.088 "raid_level": "raid1", 00:23:45.088 "superblock": true, 00:23:45.088 "num_base_bdevs": 2, 00:23:45.088 "num_base_bdevs_discovered": 2, 00:23:45.088 "num_base_bdevs_operational": 2, 00:23:45.088 "process": { 00:23:45.088 "type": "rebuild", 00:23:45.088 "target": "spare", 00:23:45.088 "progress": { 00:23:45.088 "blocks": 2816, 00:23:45.088 "percent": 35 00:23:45.088 } 00:23:45.088 }, 00:23:45.088 "base_bdevs_list": [ 00:23:45.088 { 00:23:45.088 "name": "spare", 00:23:45.088 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:45.088 "is_configured": true, 00:23:45.088 "data_offset": 256, 00:23:45.088 "data_size": 7936 00:23:45.088 }, 00:23:45.088 { 00:23:45.088 "name": "BaseBdev2", 00:23:45.088 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:45.088 "is_configured": true, 00:23:45.088 "data_offset": 256, 00:23:45.088 "data_size": 7936 00:23:45.088 } 00:23:45.088 ] 00:23:45.088 }' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:45.088 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=892 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:45.088 13:46:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:45.088 "name": "raid_bdev1", 00:23:45.088 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:45.088 "strip_size_kb": 0, 00:23:45.088 "state": "online", 00:23:45.088 "raid_level": "raid1", 00:23:45.088 "superblock": true, 00:23:45.088 "num_base_bdevs": 2, 00:23:45.088 "num_base_bdevs_discovered": 2, 00:23:45.088 "num_base_bdevs_operational": 2, 00:23:45.088 "process": { 00:23:45.088 "type": "rebuild", 00:23:45.088 "target": "spare", 00:23:45.088 "progress": { 00:23:45.088 "blocks": 3328, 00:23:45.088 "percent": 41 00:23:45.088 } 00:23:45.088 }, 00:23:45.088 "base_bdevs_list": [ 00:23:45.088 { 00:23:45.088 "name": "spare", 00:23:45.088 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:45.088 "is_configured": true, 00:23:45.088 "data_offset": 256, 00:23:45.088 "data_size": 7936 00:23:45.088 }, 00:23:45.088 { 00:23:45.088 "name": "BaseBdev2", 00:23:45.088 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:45.088 "is_configured": true, 00:23:45.088 "data_offset": 256, 00:23:45.088 "data_size": 7936 00:23:45.088 } 00:23:45.088 ] 00:23:45.088 }' 00:23:45.088 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:45.347 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:45.347 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:45.347 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:45.347 13:46:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:46.282 
13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.282 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.541 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:46.541 "name": "raid_bdev1", 00:23:46.541 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:46.541 "strip_size_kb": 0, 00:23:46.541 "state": "online", 00:23:46.541 "raid_level": "raid1", 00:23:46.541 "superblock": true, 00:23:46.541 "num_base_bdevs": 2, 00:23:46.541 "num_base_bdevs_discovered": 2, 00:23:46.541 "num_base_bdevs_operational": 2, 00:23:46.541 "process": { 00:23:46.541 "type": "rebuild", 00:23:46.541 "target": "spare", 00:23:46.541 "progress": { 00:23:46.541 "blocks": 6656, 00:23:46.541 "percent": 83 00:23:46.541 } 00:23:46.541 }, 00:23:46.541 "base_bdevs_list": [ 00:23:46.541 { 00:23:46.541 "name": "spare", 00:23:46.541 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:46.541 "is_configured": true, 00:23:46.541 "data_offset": 256, 00:23:46.541 "data_size": 7936 00:23:46.541 }, 00:23:46.541 { 00:23:46.541 "name": "BaseBdev2", 00:23:46.541 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:46.541 "is_configured": true, 00:23:46.541 "data_offset": 256, 00:23:46.541 "data_size": 7936 00:23:46.541 } 00:23:46.541 ] 00:23:46.541 }' 00:23:46.541 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:46.541 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:46.541 13:46:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:46.541 13:46:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:46.541 13:46:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:46.801 [2024-07-15 13:46:34.404036] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:46.801 [2024-07-15 13:46:34.404076] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:46.801 [2024-07-15 13:46:34.404138] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.737 13:46:35 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.737 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.737 "name": "raid_bdev1", 00:23:47.737 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:47.737 "strip_size_kb": 0, 00:23:47.737 "state": "online", 00:23:47.737 "raid_level": "raid1", 00:23:47.737 "superblock": true, 00:23:47.737 "num_base_bdevs": 2, 00:23:47.737 "num_base_bdevs_discovered": 2, 00:23:47.737 "num_base_bdevs_operational": 2, 00:23:47.737 "base_bdevs_list": [ 00:23:47.737 { 00:23:47.737 "name": "spare", 00:23:47.737 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:47.737 "is_configured": true, 00:23:47.737 "data_offset": 256, 00:23:47.737 "data_size": 7936 00:23:47.737 }, 00:23:47.737 { 00:23:47.737 "name": "BaseBdev2", 00:23:47.737 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:47.737 "is_configured": true, 00:23:47.737 "data_offset": 256, 00:23:47.738 "data_size": 7936 00:23:47.738 } 00:23:47.738 ] 00:23:47.738 }' 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.738 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.997 "name": "raid_bdev1", 00:23:47.997 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:47.997 "strip_size_kb": 0, 00:23:47.997 "state": "online", 00:23:47.997 "raid_level": "raid1", 00:23:47.997 "superblock": true, 00:23:47.997 "num_base_bdevs": 2, 00:23:47.997 "num_base_bdevs_discovered": 2, 00:23:47.997 "num_base_bdevs_operational": 2, 00:23:47.997 "base_bdevs_list": [ 00:23:47.997 { 00:23:47.997 "name": "spare", 00:23:47.997 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:47.997 "is_configured": true, 00:23:47.997 "data_offset": 256, 00:23:47.997 "data_size": 7936 00:23:47.997 }, 00:23:47.997 { 00:23:47.997 "name": "BaseBdev2", 00:23:47.997 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:47.997 
"is_configured": true, 00:23:47.997 "data_offset": 256, 00:23:47.997 "data_size": 7936 00:23:47.997 } 00:23:47.997 ] 00:23:47.997 }' 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.997 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.256 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.256 "name": "raid_bdev1", 00:23:48.256 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:48.256 "strip_size_kb": 0, 00:23:48.256 "state": "online", 00:23:48.256 "raid_level": "raid1", 00:23:48.256 "superblock": true, 00:23:48.256 "num_base_bdevs": 2, 00:23:48.256 "num_base_bdevs_discovered": 2, 00:23:48.256 "num_base_bdevs_operational": 2, 00:23:48.256 "base_bdevs_list": [ 00:23:48.256 { 00:23:48.256 "name": "spare", 00:23:48.256 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:48.256 "is_configured": true, 00:23:48.256 "data_offset": 256, 00:23:48.256 "data_size": 7936 00:23:48.256 }, 00:23:48.256 { 00:23:48.256 "name": "BaseBdev2", 00:23:48.256 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:48.256 "is_configured": true, 00:23:48.256 "data_offset": 256, 00:23:48.256 "data_size": 7936 00:23:48.256 } 00:23:48.256 ] 00:23:48.256 }' 00:23:48.256 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.256 13:46:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:48.822 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:48.822 [2024-07-15 13:46:36.288254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:48.822 [2024-07-15 13:46:36.288277] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:48.822 [2024-07-15 13:46:36.288318] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:48.822 [2024-07-15 13:46:36.288356] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:48.823 [2024-07-15 13:46:36.288364] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2052c30 name raid_bdev1, state offline 00:23:48.823 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.823 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:23:49.081 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:49.081 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:23:49.081 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:49.081 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:49.081 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:49.341 [2024-07-15 13:46:36.785523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:49.341 [2024-07-15 13:46:36.785554] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.341 [2024-07-15 13:46:36.785569] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2052900 00:23:49.341 [2024-07-15 13:46:36.785576] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.341 [2024-07-15 13:46:36.786651] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.341 [2024-07-15 13:46:36.786673] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:49.341 [2024-07-15 13:46:36.786716] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:49.341 [2024-07-15 13:46:36.786740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:49.341 [2024-07-15 13:46:36.786804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:49.341 spare 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.341 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.341 [2024-07-15 13:46:36.887095] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2053820 00:23:49.341 [2024-07-15 13:46:36.887111] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:49.341 [2024-07-15 13:46:36.887183] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2047dc0 00:23:49.341 [2024-07-15 13:46:36.887257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2053820 00:23:49.341 [2024-07-15 13:46:36.887264] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2053820 00:23:49.341 [2024-07-15 13:46:36.887314] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:49.600 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.600 "name": "raid_bdev1", 00:23:49.600 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:49.600 "strip_size_kb": 0, 00:23:49.600 "state": "online", 00:23:49.600 "raid_level": "raid1", 00:23:49.600 "superblock": true, 00:23:49.600 "num_base_bdevs": 2, 00:23:49.600 "num_base_bdevs_discovered": 2, 00:23:49.600 "num_base_bdevs_operational": 2, 00:23:49.600 "base_bdevs_list": [ 00:23:49.600 { 00:23:49.600 "name": "spare", 00:23:49.600 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:49.600 "is_configured": true, 00:23:49.600 "data_offset": 256, 00:23:49.600 "data_size": 7936 00:23:49.600 }, 00:23:49.600 { 00:23:49.600 "name": "BaseBdev2", 00:23:49.600 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:49.600 "is_configured": true, 00:23:49.600 "data_offset": 256, 00:23:49.600 "data_size": 7936 00:23:49.600 } 00:23:49.600 ] 00:23:49.600 }' 00:23:49.600 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.600 13:46:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.860 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.118 "name": "raid_bdev1", 00:23:50.118 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:50.118 "strip_size_kb": 0, 00:23:50.118 "state": "online", 00:23:50.118 "raid_level": "raid1", 00:23:50.118 "superblock": true, 00:23:50.118 "num_base_bdevs": 2, 00:23:50.118 "num_base_bdevs_discovered": 2, 00:23:50.118 "num_base_bdevs_operational": 2, 00:23:50.118 "base_bdevs_list": [ 00:23:50.118 { 00:23:50.118 "name": "spare", 00:23:50.118 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:50.118 "is_configured": true, 00:23:50.118 "data_offset": 256, 00:23:50.118 "data_size": 7936 00:23:50.118 }, 00:23:50.118 { 00:23:50.118 "name": "BaseBdev2", 00:23:50.118 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:50.118 "is_configured": true, 00:23:50.118 "data_offset": 256, 00:23:50.118 "data_size": 7936 00:23:50.118 } 00:23:50.118 ] 00:23:50.118 }' 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.118 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:50.377 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:50.377 13:46:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:50.635 [2024-07-15 13:46:38.076934] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:50.635 
13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.635 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.894 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.894 "name": "raid_bdev1", 00:23:50.894 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:50.894 "strip_size_kb": 0, 00:23:50.894 "state": "online", 00:23:50.894 "raid_level": "raid1", 00:23:50.894 "superblock": true, 00:23:50.894 "num_base_bdevs": 2, 00:23:50.894 "num_base_bdevs_discovered": 1, 00:23:50.894 "num_base_bdevs_operational": 1, 00:23:50.894 "base_bdevs_list": [ 00:23:50.894 { 00:23:50.894 "name": null, 00:23:50.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.894 "is_configured": false, 00:23:50.894 "data_offset": 256, 00:23:50.894 "data_size": 7936 00:23:50.894 }, 00:23:50.894 { 00:23:50.894 "name": "BaseBdev2", 00:23:50.894 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:50.894 "is_configured": true, 00:23:50.894 "data_offset": 256, 00:23:50.894 "data_size": 7936 00:23:50.894 } 00:23:50.894 ] 00:23:50.894 }' 00:23:50.894 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.894 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:51.152 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:51.411 [2024-07-15 13:46:38.911116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:51.411 [2024-07-15 13:46:38.911231] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:51.411 [2024-07-15 13:46:38.911244] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:51.411 [2024-07-15 13:46:38.911265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:51.411 [2024-07-15 13:46:38.914394] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2053da0 00:23:51.411 [2024-07-15 13:46:38.915432] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:51.411 13:46:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.347 13:46:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.605 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.605 "name": "raid_bdev1", 00:23:52.605 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:52.605 "strip_size_kb": 0, 00:23:52.605 "state": "online", 00:23:52.605 "raid_level": "raid1", 00:23:52.605 "superblock": true, 00:23:52.605 "num_base_bdevs": 2, 00:23:52.605 "num_base_bdevs_discovered": 2, 00:23:52.605 "num_base_bdevs_operational": 2, 00:23:52.605 "process": { 00:23:52.605 "type": "rebuild", 00:23:52.605 "target": "spare", 00:23:52.605 "progress": { 00:23:52.605 "blocks": 2816, 00:23:52.605 "percent": 35 00:23:52.605 } 00:23:52.605 }, 00:23:52.605 "base_bdevs_list": [ 00:23:52.605 { 00:23:52.605 "name": "spare", 00:23:52.605 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:52.605 "is_configured": true, 00:23:52.605 "data_offset": 256, 00:23:52.605 "data_size": 7936 00:23:52.605 }, 00:23:52.605 { 00:23:52.605 "name": "BaseBdev2", 00:23:52.605 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:52.605 "is_configured": true, 00:23:52.605 "data_offset": 256, 00:23:52.605 "data_size": 7936 00:23:52.605 } 00:23:52.605 ] 00:23:52.605 }' 00:23:52.605 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.605 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:52.605 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.605 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.605 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:52.863 [2024-07-15 13:46:40.363922] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:52.863 [2024-07-15 13:46:40.426337] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:52.863 [2024-07-15 13:46:40.426376] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.863 [2024-07-15 13:46:40.426387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:52.863 [2024-07-15 13:46:40.426393] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.863 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.120 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.120 "name": "raid_bdev1", 00:23:53.120 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:53.120 "strip_size_kb": 0, 00:23:53.120 "state": "online", 00:23:53.120 "raid_level": "raid1", 00:23:53.120 "superblock": true, 00:23:53.120 "num_base_bdevs": 2, 00:23:53.120 "num_base_bdevs_discovered": 1, 00:23:53.120 "num_base_bdevs_operational": 1, 00:23:53.120 "base_bdevs_list": [ 00:23:53.120 { 00:23:53.120 "name": null, 00:23:53.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.120 "is_configured": false, 00:23:53.120 "data_offset": 256, 00:23:53.120 "data_size": 7936 00:23:53.120 }, 00:23:53.120 { 00:23:53.120 "name": "BaseBdev2", 00:23:53.120 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:53.120 "is_configured": true, 00:23:53.120 "data_offset": 256, 00:23:53.120 "data_size": 7936 00:23:53.120 } 00:23:53.120 ] 00:23:53.120 }' 00:23:53.121 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.121 13:46:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:53.685 13:46:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:53.685 [2024-07-15 
13:46:41.288702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:53.685 [2024-07-15 13:46:41.288743] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.685 [2024-07-15 13:46:41.288759] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2053ba0 00:23:53.685 [2024-07-15 13:46:41.288768] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.685 [2024-07-15 13:46:41.288916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.685 [2024-07-15 13:46:41.288927] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:53.685 [2024-07-15 13:46:41.288968] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:53.685 [2024-07-15 13:46:41.288976] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:53.685 [2024-07-15 13:46:41.288983] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:53.685 [2024-07-15 13:46:41.289002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:53.685 [2024-07-15 13:46:41.292169] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eb9260 00:23:53.685 [2024-07-15 13:46:41.293140] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:53.685 spare 00:23:53.943 13:46:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.876 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.134 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:55.134 "name": "raid_bdev1", 00:23:55.134 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:55.134 "strip_size_kb": 0, 00:23:55.134 "state": "online", 00:23:55.134 "raid_level": "raid1", 00:23:55.134 "superblock": true, 00:23:55.134 "num_base_bdevs": 2, 00:23:55.134 "num_base_bdevs_discovered": 2, 00:23:55.134 "num_base_bdevs_operational": 2, 00:23:55.134 "process": { 00:23:55.134 "type": "rebuild", 00:23:55.134 "target": "spare", 00:23:55.134 "progress": { 00:23:55.134 "blocks": 2816, 00:23:55.134 "percent": 35 00:23:55.134 } 00:23:55.134 }, 00:23:55.134 "base_bdevs_list": [ 00:23:55.134 { 00:23:55.134 "name": "spare", 00:23:55.134 "uuid": "42d77839-efe7-5de2-a616-04884a7bfcec", 00:23:55.134 "is_configured": true, 00:23:55.134 "data_offset": 256, 00:23:55.134 
"data_size": 7936 00:23:55.134 }, 00:23:55.134 { 00:23:55.134 "name": "BaseBdev2", 00:23:55.134 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:55.134 "is_configured": true, 00:23:55.134 "data_offset": 256, 00:23:55.134 "data_size": 7936 00:23:55.134 } 00:23:55.134 ] 00:23:55.134 }' 00:23:55.134 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:55.134 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:55.134 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:55.134 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:55.134 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:55.134 [2024-07-15 13:46:42.713678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:55.392 [2024-07-15 13:46:42.804608] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:55.392 [2024-07-15 13:46:42.804639] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:55.392 [2024-07-15 13:46:42.804665] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:55.392 [2024-07-15 13:46:42.804671] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:55.392 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:55.392 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.392 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.392 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.392 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.392 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.393 "name": "raid_bdev1", 00:23:55.393 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:55.393 "strip_size_kb": 0, 00:23:55.393 "state": "online", 00:23:55.393 
"raid_level": "raid1", 00:23:55.393 "superblock": true, 00:23:55.393 "num_base_bdevs": 2, 00:23:55.393 "num_base_bdevs_discovered": 1, 00:23:55.393 "num_base_bdevs_operational": 1, 00:23:55.393 "base_bdevs_list": [ 00:23:55.393 { 00:23:55.393 "name": null, 00:23:55.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.393 "is_configured": false, 00:23:55.393 "data_offset": 256, 00:23:55.393 "data_size": 7936 00:23:55.393 }, 00:23:55.393 { 00:23:55.393 "name": "BaseBdev2", 00:23:55.393 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:55.393 "is_configured": true, 00:23:55.393 "data_offset": 256, 00:23:55.393 "data_size": 7936 00:23:55.393 } 00:23:55.393 ] 00:23:55.393 }' 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.393 13:46:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.959 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.215 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.215 "name": "raid_bdev1", 00:23:56.215 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:56.215 "strip_size_kb": 0, 00:23:56.215 "state": "online", 00:23:56.215 "raid_level": "raid1", 00:23:56.215 "superblock": true, 00:23:56.215 "num_base_bdevs": 2, 00:23:56.215 "num_base_bdevs_discovered": 1, 00:23:56.215 "num_base_bdevs_operational": 1, 00:23:56.215 "base_bdevs_list": [ 00:23:56.215 { 00:23:56.215 "name": null, 00:23:56.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.215 "is_configured": false, 00:23:56.215 "data_offset": 256, 00:23:56.215 "data_size": 7936 00:23:56.215 }, 00:23:56.215 { 00:23:56.215 "name": "BaseBdev2", 00:23:56.215 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:56.215 "is_configured": true, 00:23:56.215 "data_offset": 256, 00:23:56.215 "data_size": 7936 00:23:56.215 } 00:23:56.215 ] 00:23:56.215 }' 00:23:56.215 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.215 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.215 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.215 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.215 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:56.472 13:46:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:56.472 [2024-07-15 13:46:44.079974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:56.472 [2024-07-15 13:46:44.080031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.472 [2024-07-15 13:46:44.080050] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eba860 00:23:56.472 [2024-07-15 13:46:44.080059] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.472 [2024-07-15 13:46:44.080189] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.473 [2024-07-15 13:46:44.080200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:56.473 [2024-07-15 13:46:44.080236] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:56.473 [2024-07-15 13:46:44.080245] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:56.473 [2024-07-15 13:46:44.080252] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:56.473 BaseBdev1 00:23:56.730 13:46:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:57.667 "name": "raid_bdev1", 00:23:57.667 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:57.667 "strip_size_kb": 0, 00:23:57.667 "state": "online", 00:23:57.667 "raid_level": "raid1", 00:23:57.667 
"superblock": true, 00:23:57.667 "num_base_bdevs": 2, 00:23:57.667 "num_base_bdevs_discovered": 1, 00:23:57.667 "num_base_bdevs_operational": 1, 00:23:57.667 "base_bdevs_list": [ 00:23:57.667 { 00:23:57.667 "name": null, 00:23:57.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.667 "is_configured": false, 00:23:57.667 "data_offset": 256, 00:23:57.667 "data_size": 7936 00:23:57.667 }, 00:23:57.667 { 00:23:57.667 "name": "BaseBdev2", 00:23:57.667 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:57.667 "is_configured": true, 00:23:57.667 "data_offset": 256, 00:23:57.667 "data_size": 7936 00:23:57.667 } 00:23:57.667 ] 00:23:57.667 }' 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:57.667 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.235 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.494 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:58.494 "name": "raid_bdev1", 00:23:58.494 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:58.494 "strip_size_kb": 0, 00:23:58.494 "state": "online", 00:23:58.494 "raid_level": "raid1", 00:23:58.494 "superblock": true, 00:23:58.494 "num_base_bdevs": 2, 00:23:58.494 "num_base_bdevs_discovered": 1, 00:23:58.494 "num_base_bdevs_operational": 1, 00:23:58.494 "base_bdevs_list": [ 00:23:58.494 { 00:23:58.494 "name": null, 00:23:58.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.494 "is_configured": false, 00:23:58.494 "data_offset": 256, 00:23:58.494 "data_size": 7936 00:23:58.494 }, 00:23:58.494 { 00:23:58.494 "name": "BaseBdev2", 00:23:58.494 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:58.494 "is_configured": true, 00:23:58.494 "data_offset": 256, 00:23:58.494 "data_size": 7936 00:23:58.494 } 00:23:58.494 ] 00:23:58.494 }' 00:23:58.494 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:58.494 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:58.494 13:46:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:58.494 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:58.495 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:58.753 [2024-07-15 13:46:46.177510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:58.753 [2024-07-15 13:46:46.177610] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:58.753 [2024-07-15 13:46:46.177620] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:58.753 request: 00:23:58.753 { 00:23:58.753 "base_bdev": "BaseBdev1", 00:23:58.753 "raid_bdev": "raid_bdev1", 00:23:58.753 "method": "bdev_raid_add_base_bdev", 00:23:58.753 "req_id": 1 00:23:58.753 } 00:23:58.753 Got JSON-RPC error response 00:23:58.753 response: 00:23:58.753 { 00:23:58.753 "code": -22, 00:23:58.753 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:58.753 } 00:23:58.753 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:23:58.753 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:58.753 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:58.753 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:58.753 13:46:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:59.705 13:46:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.705 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.963 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.963 "name": "raid_bdev1", 00:23:59.963 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:23:59.963 "strip_size_kb": 0, 00:23:59.963 "state": "online", 00:23:59.963 "raid_level": "raid1", 00:23:59.963 "superblock": true, 00:23:59.963 "num_base_bdevs": 2, 00:23:59.963 "num_base_bdevs_discovered": 1, 00:23:59.963 "num_base_bdevs_operational": 1, 00:23:59.963 "base_bdevs_list": [ 00:23:59.963 { 00:23:59.963 "name": null, 00:23:59.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.963 "is_configured": false, 00:23:59.963 "data_offset": 256, 00:23:59.963 "data_size": 7936 00:23:59.963 }, 00:23:59.963 { 00:23:59.963 "name": "BaseBdev2", 00:23:59.963 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:23:59.963 "is_configured": true, 00:23:59.963 "data_offset": 256, 00:23:59.963 "data_size": 7936 00:23:59.963 } 00:23:59.963 ] 00:23:59.963 }' 00:23:59.963 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.963 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.529 13:46:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.529 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.529 "name": "raid_bdev1", 00:24:00.529 "uuid": "0041293b-3dce-4545-8d80-4cac34bad3fa", 00:24:00.529 "strip_size_kb": 0, 00:24:00.529 "state": "online", 00:24:00.529 "raid_level": "raid1", 00:24:00.529 "superblock": true, 00:24:00.529 "num_base_bdevs": 2, 00:24:00.529 "num_base_bdevs_discovered": 1, 00:24:00.529 "num_base_bdevs_operational": 1, 00:24:00.529 "base_bdevs_list": [ 00:24:00.529 { 00:24:00.529 "name": null, 00:24:00.529 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.529 "is_configured": false, 00:24:00.529 "data_offset": 256, 00:24:00.529 "data_size": 7936 00:24:00.529 }, 00:24:00.529 { 00:24:00.529 "name": "BaseBdev2", 00:24:00.529 "uuid": "a2fed592-728a-56ed-820e-2ab68f106f54", 00:24:00.529 "is_configured": true, 00:24:00.529 "data_offset": 256, 00:24:00.529 "data_size": 7936 00:24:00.529 } 00:24:00.529 ] 00:24:00.529 }' 00:24:00.529 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.529 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:00.529 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 103021 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 103021 ']' 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 103021 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 103021 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 103021' 00:24:00.788 killing process with pid 103021 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 103021 00:24:00.788 Received shutdown signal, test time was about 60.000000 seconds 00:24:00.788 00:24:00.788 Latency(us) 00:24:00.788 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:00.788 =================================================================================================================== 00:24:00.788 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:00.788 [2024-07-15 13:46:48.198065] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:00.788 [2024-07-15 13:46:48.198130] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.788 [2024-07-15 
13:46:48.198162] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.788 [2024-07-15 13:46:48.198170] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2053820 name raid_bdev1, state offline 00:24:00.788 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 103021 00:24:00.788 [2024-07-15 13:46:48.228894] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:01.046 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:24:01.046 00:24:01.046 real 0m23.846s 00:24:01.046 user 0m36.530s 00:24:01.046 sys 0m3.098s 00:24:01.046 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:01.046 13:46:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:01.046 ************************************ 00:24:01.046 END TEST raid_rebuild_test_sb_md_interleaved 00:24:01.046 ************************************ 00:24:01.046 13:46:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:01.046 13:46:48 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:24:01.046 13:46:48 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:24:01.046 13:46:48 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 103021 ']' 00:24:01.046 13:46:48 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 103021 00:24:01.046 13:46:48 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:24:01.046 00:24:01.046 real 14m37.584s 00:24:01.046 user 24m13.767s 00:24:01.046 sys 2m47.332s 00:24:01.046 13:46:48 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:01.046 13:46:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:01.046 ************************************ 00:24:01.046 END TEST bdev_raid 00:24:01.046 ************************************ 00:24:01.046 13:46:48 -- common/autotest_common.sh@1142 -- # return 0 00:24:01.046 13:46:48 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:24:01.046 13:46:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:24:01.046 13:46:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:01.046 13:46:48 -- common/autotest_common.sh@10 -- # set +x 00:24:01.046 ************************************ 00:24:01.046 START TEST bdevperf_config 00:24:01.046 ************************************ 00:24:01.046 13:46:48 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:24:01.304 * Looking for test storage... 
00:24:01.304 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:01.304 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:24:01.304 13:46:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:01.304 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:01.305 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:24:01.305 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:01.305 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:01.305 13:46:48 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:03.836 13:46:51 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 13:46:48.776682] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:03.836 [2024-07-15 13:46:48.776750] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106555 ] 00:24:03.836 Using job config with 4 jobs 00:24:03.836 [2024-07-15 13:46:48.870252] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:03.836 [2024-07-15 13:46:48.968193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:03.836 cpumask for '\''job0'\'' is too big 00:24:03.836 cpumask for '\''job1'\'' is too big 00:24:03.836 cpumask for '\''job2'\'' is too big 00:24:03.836 cpumask for '\''job3'\'' is too big 00:24:03.836 Running I/O for 2 seconds... 00:24:03.836 00:24:03.836 Latency(us) 00:24:03.836 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:03.836 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.836 Malloc0 : 2.01 37193.31 36.32 0.00 0.00 6876.28 1303.60 10656.72 00:24:03.836 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.836 Malloc0 : 2.01 37170.82 36.30 0.00 0.00 6869.39 1253.73 9516.97 00:24:03.836 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.02 37210.59 36.34 0.00 0.00 6851.95 1210.99 9232.03 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.02 37188.27 36.32 0.00 0.00 6846.83 1232.36 9232.03 00:24:03.837 =================================================================================================================== 00:24:03.837 Total : 148762.99 145.28 0.00 0.00 6861.09 1210.99 10656.72' 00:24:03.837 13:46:51 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 13:46:48.776682] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:24:03.837 [2024-07-15 13:46:48.776750] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106555 ] 00:24:03.837 Using job config with 4 jobs 00:24:03.837 [2024-07-15 13:46:48.870252] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:03.837 [2024-07-15 13:46:48.968193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:03.837 cpumask for '\''job0'\'' is too big 00:24:03.837 cpumask for '\''job1'\'' is too big 00:24:03.837 cpumask for '\''job2'\'' is too big 00:24:03.837 cpumask for '\''job3'\'' is too big 00:24:03.837 Running I/O for 2 seconds... 00:24:03.837 00:24:03.837 Latency(us) 00:24:03.837 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.01 37193.31 36.32 0.00 0.00 6876.28 1303.60 10656.72 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.01 37170.82 36.30 0.00 0.00 6869.39 1253.73 9516.97 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.02 37210.59 36.34 0.00 0.00 6851.95 1210.99 9232.03 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.02 37188.27 36.32 0.00 0.00 6846.83 1232.36 9232.03 00:24:03.837 =================================================================================================================== 00:24:03.837 Total : 148762.99 145.28 0.00 0.00 6861.09 1210.99 10656.72' 00:24:03.837 13:46:51 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 13:46:48.776682] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:03.837 [2024-07-15 13:46:48.776750] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106555 ] 00:24:03.837 Using job config with 4 jobs 00:24:03.837 [2024-07-15 13:46:48.870252] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:03.837 [2024-07-15 13:46:48.968193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:03.837 cpumask for '\''job0'\'' is too big 00:24:03.837 cpumask for '\''job1'\'' is too big 00:24:03.837 cpumask for '\''job2'\'' is too big 00:24:03.837 cpumask for '\''job3'\'' is too big 00:24:03.837 Running I/O for 2 seconds... 
00:24:03.837 00:24:03.837 Latency(us) 00:24:03.837 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.01 37193.31 36.32 0.00 0.00 6876.28 1303.60 10656.72 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.01 37170.82 36.30 0.00 0.00 6869.39 1253.73 9516.97 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.02 37210.59 36.34 0.00 0.00 6851.95 1210.99 9232.03 00:24:03.837 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:03.837 Malloc0 : 2.02 37188.27 36.32 0.00 0.00 6846.83 1232.36 9232.03 00:24:03.837 =================================================================================================================== 00:24:03.837 Total : 148762.99 145.28 0.00 0.00 6861.09 1210.99 10656.72' 00:24:03.837 13:46:51 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:24:03.837 13:46:51 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:24:03.837 13:46:51 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:24:03.837 13:46:51 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:03.837 [2024-07-15 13:46:51.409405] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:03.837 [2024-07-15 13:46:51.409456] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106861 ] 00:24:04.095 [2024-07-15 13:46:51.503513] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:04.095 [2024-07-15 13:46:51.599781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:04.353 cpumask for 'job0' is too big 00:24:04.353 cpumask for 'job1' is too big 00:24:04.353 cpumask for 'job2' is too big 00:24:04.353 cpumask for 'job3' is too big 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:24:06.899 Running I/O for 2 seconds... 
00:24:06.899 00:24:06.899 Latency(us) 00:24:06.899 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:06.899 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:06.899 Malloc0 : 2.01 37287.34 36.41 0.00 0.00 6863.14 1253.73 10257.81 00:24:06.899 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:06.899 Malloc0 : 2.01 37266.06 36.39 0.00 0.00 6857.08 1168.25 9061.06 00:24:06.899 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:06.899 Malloc0 : 2.01 37244.96 36.37 0.00 0.00 6851.88 1161.13 7978.30 00:24:06.899 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:06.899 Malloc0 : 2.02 37223.87 36.35 0.00 0.00 6846.07 1154.00 7522.39 00:24:06.899 =================================================================================================================== 00:24:06.899 Total : 149022.23 145.53 0.00 0.00 6854.54 1154.00 10257.81' 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:06.899 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:06.899 00:24:06.899 13:46:53 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:06.899 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:06.899 13:46:54 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 13:46:54.062903] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:09.431 [2024-07-15 13:46:54.062953] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107202 ] 00:24:09.431 Using job config with 3 jobs 00:24:09.431 [2024-07-15 13:46:54.157316] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:09.431 [2024-07-15 13:46:54.252640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:09.431 cpumask for '\''job0'\'' is too big 00:24:09.431 cpumask for '\''job1'\'' is too big 00:24:09.431 cpumask for '\''job2'\'' is too big 00:24:09.431 Running I/O for 2 seconds... 00:24:09.431 00:24:09.431 Latency(us) 00:24:09.431 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51264.96 50.06 0.00 0.00 4986.86 1360.58 8605.16 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51233.37 50.03 0.00 0.00 4981.74 1360.58 7265.95 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51287.43 50.09 0.00 0.00 4967.93 577.00 6097.70 00:24:09.431 =================================================================================================================== 00:24:09.431 Total : 153785.76 150.18 0.00 0.00 4978.83 577.00 8605.16' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 13:46:54.062903] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:09.431 [2024-07-15 13:46:54.062953] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107202 ] 00:24:09.431 Using job config with 3 jobs 00:24:09.431 [2024-07-15 13:46:54.157316] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:09.431 [2024-07-15 13:46:54.252640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:09.431 cpumask for '\''job0'\'' is too big 00:24:09.431 cpumask for '\''job1'\'' is too big 00:24:09.431 cpumask for '\''job2'\'' is too big 00:24:09.431 Running I/O for 2 seconds... 
00:24:09.431 00:24:09.431 Latency(us) 00:24:09.431 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51264.96 50.06 0.00 0.00 4986.86 1360.58 8605.16 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51233.37 50.03 0.00 0.00 4981.74 1360.58 7265.95 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51287.43 50.09 0.00 0.00 4967.93 577.00 6097.70 00:24:09.431 =================================================================================================================== 00:24:09.431 Total : 153785.76 150.18 0.00 0.00 4978.83 577.00 8605.16' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 13:46:54.062903] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:09.431 [2024-07-15 13:46:54.062953] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107202 ] 00:24:09.431 Using job config with 3 jobs 00:24:09.431 [2024-07-15 13:46:54.157316] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:09.431 [2024-07-15 13:46:54.252640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:09.431 cpumask for '\''job0'\'' is too big 00:24:09.431 cpumask for '\''job1'\'' is too big 00:24:09.431 cpumask for '\''job2'\'' is too big 00:24:09.431 Running I/O for 2 seconds... 00:24:09.431 00:24:09.431 Latency(us) 00:24:09.431 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51264.96 50.06 0.00 0.00 4986.86 1360.58 8605.16 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51233.37 50.03 0.00 0.00 4981.74 1360.58 7265.95 00:24:09.431 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:09.431 Malloc0 : 2.01 51287.43 50.09 0.00 0.00 4967.93 577.00 6097.70 00:24:09.431 =================================================================================================================== 00:24:09.431 Total : 153785.76 150.18 0.00 0.00 4978.83 577.00 8605.16' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:24:09.431 
13:46:56 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:09.431 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:09.431 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:09.431 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:09.431 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:09.431 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:09.431 13:46:56 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:11.964 13:46:59 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 13:46:56.691062] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:24:11.964 [2024-07-15 13:46:56.691112] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107568 ] 00:24:11.964 Using job config with 4 jobs 00:24:11.964 [2024-07-15 13:46:56.779990] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:11.964 [2024-07-15 13:46:56.875339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:11.964 cpumask for '\''job0'\'' is too big 00:24:11.964 cpumask for '\''job1'\'' is too big 00:24:11.964 cpumask for '\''job2'\'' is too big 00:24:11.964 cpumask for '\''job3'\'' is too big 00:24:11.964 Running I/O for 2 seconds... 00:24:11.964 00:24:11.964 Latency(us) 00:24:11.964 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.02 18979.95 18.54 0.00 0.00 13479.58 2578.70 21541.40 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.02 18968.86 18.52 0.00 0.00 13479.10 3134.33 21541.40 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18958.18 18.51 0.00 0.00 13454.24 2550.21 18919.96 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18947.15 18.50 0.00 0.00 13453.68 3091.59 18919.96 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18936.49 18.49 0.00 0.00 13429.38 2393.49 16526.47 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18925.58 18.48 0.00 0.00 13429.47 2934.87 16640.45 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18914.86 18.47 0.00 0.00 13407.57 2393.49 14588.88 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18903.97 18.46 0.00 0.00 13406.60 2963.37 14588.88 00:24:11.964 =================================================================================================================== 00:24:11.964 Total : 151535.04 147.98 0.00 0.00 13442.45 2393.49 21541.40' 00:24:11.964 13:46:59 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 13:46:56.691062] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:11.964 [2024-07-15 13:46:56.691112] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107568 ] 00:24:11.964 Using job config with 4 jobs 00:24:11.964 [2024-07-15 13:46:56.779990] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:11.964 [2024-07-15 13:46:56.875339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:11.964 cpumask for '\''job0'\'' is too big 00:24:11.964 cpumask for '\''job1'\'' is too big 00:24:11.964 cpumask for '\''job2'\'' is too big 00:24:11.964 cpumask for '\''job3'\'' is too big 00:24:11.964 Running I/O for 2 seconds... 
00:24:11.964 00:24:11.964 Latency(us) 00:24:11.964 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.02 18979.95 18.54 0.00 0.00 13479.58 2578.70 21541.40 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.02 18968.86 18.52 0.00 0.00 13479.10 3134.33 21541.40 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18958.18 18.51 0.00 0.00 13454.24 2550.21 18919.96 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18947.15 18.50 0.00 0.00 13453.68 3091.59 18919.96 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18936.49 18.49 0.00 0.00 13429.38 2393.49 16526.47 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18925.58 18.48 0.00 0.00 13429.47 2934.87 16640.45 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18914.86 18.47 0.00 0.00 13407.57 2393.49 14588.88 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18903.97 18.46 0.00 0.00 13406.60 2963.37 14588.88 00:24:11.964 =================================================================================================================== 00:24:11.964 Total : 151535.04 147.98 0.00 0.00 13442.45 2393.49 21541.40' 00:24:11.964 13:46:59 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 13:46:56.691062] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:11.964 [2024-07-15 13:46:56.691112] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107568 ] 00:24:11.964 Using job config with 4 jobs 00:24:11.964 [2024-07-15 13:46:56.779990] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:11.964 [2024-07-15 13:46:56.875339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:11.964 cpumask for '\''job0'\'' is too big 00:24:11.964 cpumask for '\''job1'\'' is too big 00:24:11.964 cpumask for '\''job2'\'' is too big 00:24:11.964 cpumask for '\''job3'\'' is too big 00:24:11.964 Running I/O for 2 seconds... 
00:24:11.964 00:24:11.964 Latency(us) 00:24:11.964 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.02 18979.95 18.54 0.00 0.00 13479.58 2578.70 21541.40 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.02 18968.86 18.52 0.00 0.00 13479.10 3134.33 21541.40 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18958.18 18.51 0.00 0.00 13454.24 2550.21 18919.96 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc1 : 2.03 18947.15 18.50 0.00 0.00 13453.68 3091.59 18919.96 00:24:11.964 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.964 Malloc0 : 2.03 18936.49 18.49 0.00 0.00 13429.38 2393.49 16526.47 00:24:11.964 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.965 Malloc1 : 2.03 18925.58 18.48 0.00 0.00 13429.47 2934.87 16640.45 00:24:11.965 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.965 Malloc0 : 2.03 18914.86 18.47 0.00 0.00 13407.57 2393.49 14588.88 00:24:11.965 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:11.965 Malloc1 : 2.03 18903.97 18.46 0.00 0.00 13406.60 2963.37 14588.88 00:24:11.965 =================================================================================================================== 00:24:11.965 Total : 151535.04 147.98 0.00 0.00 13442.45 2393.49 21541.40' 00:24:11.965 13:46:59 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:24:11.965 13:46:59 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:24:11.965 13:46:59 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:24:11.965 13:46:59 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:24:11.965 13:46:59 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:11.965 13:46:59 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:24:11.965 00:24:11.965 real 0m10.713s 00:24:11.965 user 0m9.588s 00:24:11.965 sys 0m1.002s 00:24:11.965 13:46:59 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:11.965 13:46:59 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:24:11.965 ************************************ 00:24:11.965 END TEST bdevperf_config 00:24:11.965 ************************************ 00:24:11.965 13:46:59 -- common/autotest_common.sh@1142 -- # return 0 00:24:11.965 13:46:59 -- spdk/autotest.sh@192 -- # uname -s 00:24:11.965 13:46:59 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:24:11.965 13:46:59 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:24:11.965 13:46:59 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:24:11.965 13:46:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:11.965 13:46:59 -- common/autotest_common.sh@10 -- # set +x 00:24:11.965 ************************************ 00:24:11.965 START TEST reactor_set_interrupt 00:24:11.965 ************************************ 00:24:11.965 13:46:59 
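Note on the bdevperf_config run that finishes here: the traced helpers in test/bdev/bdevperf/common.sh build up test.conf one "[global]"/"[jobN]" section at a time (create_job), run bdevperf for 2 seconds with that job file layered on top of conf.json, and then count jobs by scraping bdevperf's "Using job config with N jobs" notice (get_num_jobs). A minimal sketch reconstructed from the xtrace above; this is not the verbatim common.sh source, and the exact keys create_job writes beyond the section header are not visible in the trace:

    # Hedged reconstruction from the xtrace; not verbatim SPDK test code.
    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    testdir=$rootdir/test/bdev/bdevperf

    # Count jobs from bdevperf's startup notice, exactly as traced in common.sh@32.
    get_num_jobs() {
        echo "$1" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
    }

    # The run itself, as traced: a 2-second pass with the generated job file on top of conf.json.
    bdevperf_output=$("$rootdir/build/examples/bdevperf" -t 2 \
        --json "$testdir/conf.json" -j "$testdir/test.conf")
    [[ $(get_num_jobs "$bdevperf_output") == 3 ]]

The same pattern repeats for the 3-job write pass and the 4-job rw pass above, with only the expected job count changing.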
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:24:11.965 * Looking for test storage... 00:24:11.965 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:11.965 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:24:11.965 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:24:11.965 13:46:59 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:24:11.965 13:46:59 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:24:11.966 13:46:59 
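The CONFIG_* assignments sourced from test/common/build_config.sh above record how this tree was configured for the crypto run (notably CONFIG_CRYPTO=y, CONFIG_VBDEV_COMPRESS=y, CONFIG_IPSEC_MB=y, CONFIG_DPDK_COMPRESSDEV=y and CONFIG_UBSAN=y). A test that wanted to gate on those flags could do something like the following; this is an illustrative sketch only, the traced script simply sources the file and continues:

    # Illustrative sketch, not code from the traced scripts.
    source "$rootdir/test/common/build_config.sh"
    if [[ $CONFIG_CRYPTO != y || $CONFIG_VBDEV_COMPRESS != y ]]; then
        echo "crypto/compress bdev support not built in; nothing to test here"
        exit 0
    fi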
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:24:11.966 #define SPDK_CONFIG_H 00:24:11.966 #define SPDK_CONFIG_APPS 1 00:24:11.966 #define SPDK_CONFIG_ARCH native 00:24:11.966 #undef SPDK_CONFIG_ASAN 00:24:11.966 #undef SPDK_CONFIG_AVAHI 00:24:11.966 #undef SPDK_CONFIG_CET 00:24:11.966 #define SPDK_CONFIG_COVERAGE 1 00:24:11.966 #define SPDK_CONFIG_CROSS_PREFIX 00:24:11.966 #define SPDK_CONFIG_CRYPTO 1 00:24:11.966 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:24:11.966 #undef SPDK_CONFIG_CUSTOMOCF 00:24:11.966 #undef SPDK_CONFIG_DAOS 00:24:11.966 #define SPDK_CONFIG_DAOS_DIR 00:24:11.966 #define SPDK_CONFIG_DEBUG 1 00:24:11.966 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:24:11.966 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:11.966 #define SPDK_CONFIG_DPDK_INC_DIR 00:24:11.966 #define SPDK_CONFIG_DPDK_LIB_DIR 00:24:11.966 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:24:11.966 #undef SPDK_CONFIG_DPDK_UADK 00:24:11.966 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:11.966 #define SPDK_CONFIG_EXAMPLES 1 00:24:11.966 #undef SPDK_CONFIG_FC 00:24:11.966 #define SPDK_CONFIG_FC_PATH 00:24:11.966 #define SPDK_CONFIG_FIO_PLUGIN 1 00:24:11.966 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:24:11.966 #undef SPDK_CONFIG_FUSE 00:24:11.966 #undef SPDK_CONFIG_FUZZER 00:24:11.966 #define SPDK_CONFIG_FUZZER_LIB 00:24:11.966 #undef SPDK_CONFIG_GOLANG 00:24:11.966 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:24:11.966 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:24:11.966 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:24:11.966 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:24:11.966 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:24:11.966 #undef SPDK_CONFIG_HAVE_LIBBSD 00:24:11.966 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:24:11.966 #define SPDK_CONFIG_IDXD 1 00:24:11.966 #define SPDK_CONFIG_IDXD_KERNEL 1 00:24:11.966 #define SPDK_CONFIG_IPSEC_MB 1 00:24:11.966 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:11.966 #define SPDK_CONFIG_ISAL 1 00:24:11.966 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:24:11.966 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:24:11.966 #define SPDK_CONFIG_LIBDIR 00:24:11.966 #undef SPDK_CONFIG_LTO 00:24:11.966 #define SPDK_CONFIG_MAX_LCORES 128 00:24:11.966 #define SPDK_CONFIG_NVME_CUSE 1 00:24:11.966 #undef SPDK_CONFIG_OCF 00:24:11.966 #define SPDK_CONFIG_OCF_PATH 00:24:11.966 #define SPDK_CONFIG_OPENSSL_PATH 00:24:11.966 #undef SPDK_CONFIG_PGO_CAPTURE 00:24:11.966 #define SPDK_CONFIG_PGO_DIR 00:24:11.966 #undef SPDK_CONFIG_PGO_USE 00:24:11.966 #define SPDK_CONFIG_PREFIX /usr/local 00:24:11.966 #undef SPDK_CONFIG_RAID5F 00:24:11.966 #undef SPDK_CONFIG_RBD 00:24:11.966 #define SPDK_CONFIG_RDMA 1 00:24:11.966 #define SPDK_CONFIG_RDMA_PROV verbs 00:24:11.966 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:24:11.966 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:24:11.966 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:24:11.966 #define SPDK_CONFIG_SHARED 1 00:24:11.966 #undef SPDK_CONFIG_SMA 00:24:11.966 #define SPDK_CONFIG_TESTS 1 00:24:11.966 #undef SPDK_CONFIG_TSAN 00:24:11.966 #define SPDK_CONFIG_UBLK 1 00:24:11.966 #define SPDK_CONFIG_UBSAN 1 00:24:11.966 #undef SPDK_CONFIG_UNIT_TESTS 00:24:11.966 #undef SPDK_CONFIG_URING 00:24:11.966 #define SPDK_CONFIG_URING_PATH 00:24:11.966 #undef SPDK_CONFIG_URING_ZNS 00:24:11.966 #undef SPDK_CONFIG_USDT 00:24:11.966 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:24:11.966 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:24:11.966 #undef SPDK_CONFIG_VFIO_USER 00:24:11.966 #define SPDK_CONFIG_VFIO_USER_DIR 00:24:11.966 #define SPDK_CONFIG_VHOST 1 00:24:11.966 #define SPDK_CONFIG_VIRTIO 1 00:24:11.966 #undef SPDK_CONFIG_VTUNE 00:24:11.966 #define SPDK_CONFIG_VTUNE_DIR 00:24:11.966 #define SPDK_CONFIG_WERROR 1 00:24:11.966 #define SPDK_CONFIG_WPDK_DIR 00:24:11.966 #undef SPDK_CONFIG_XNVME 00:24:11.966 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:24:11.966 13:46:59 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:24:11.966 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:11.966 13:46:59 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:11.966 13:46:59 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:11.966 13:46:59 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:11.966 13:46:59 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:11.966 13:46:59 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:11.966 13:46:59 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:11.966 13:46:59 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:24:11.966 13:46:59 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:11.966 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:24:11.966 13:46:59 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:24:12.228 
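A few lines above, applications.sh compares the full contents of include/spdk/config.h against a "#define SPDK_CONFIG_DEBUG" glob before consulting SPDK_AUTOTEST_DEBUG_APPS. A hedged reading of that check, with the file path taken from the traced [[ -e ... ]] test:

    # Hedged sketch of the applications.sh check traced above; not verbatim source.
    config_h=/var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h
    if [[ -e $config_h && $(< "$config_h") == *"#define SPDK_CONFIG_DEBUG"* ]] \
        && ((SPDK_AUTOTEST_DEBUG_APPS)); then
        : # debug variants of the SPDK apps would be selected here (assumption)
    fi

In this build the define is present (SPDK_CONFIG_DEBUG 1 appears in the dump), but SPDK_AUTOTEST_DEBUG_APPS is unset, so the regular app binaries are used.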
13:46:59 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:24:12.228 13:46:59 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:24:12.228 
13:46:59 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:24:12.228 13:46:59 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:24:12.228 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:24:12.229 
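At this point autotest_common.sh has pinned the hugepage and build settings for the run (HUGEMEM=4096, CLEAR_HUGE=yes, HUGE_EVEN_ALLOC=yes, MAKEFLAGS=-j72), and in the trace that follows set_test_storage probes the mounted filesystems for a little over 2 GiB of scratch space (requested_size=2214592512), falling back to a /tmp/spdk.XXXXXX directory if the test directory is short on room. A hedged sketch of the df -T parsing loop being traced below; the conversion to bytes is an assumption inferred from the traced values, not something spelled out in the trace:

    # Hedged sketch of the storage probe traced below; not verbatim autotest_common.sh.
    declare -A mounts fss sizes avails uses
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))    # df -T prints 1 KiB blocks; byte conversion assumed
        avails["$mount"]=$((avail * 1024))
        uses["$mount"]=$((use * 1024))
    done < <(df -T | grep -v Filesystem)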
13:46:59 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 107956 ]] 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 107956 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.4dBwF4 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.4dBwF4/tests/interrupt /tmp/spdk.4dBwF4 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=955527168 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4328902656 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:24:12.229 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=83625058304 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508580864 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=10883522560 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249580032 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254290432 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892238848 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901716992 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9478144 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253454848 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254290432 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=835584 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450852352 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450856448 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test 
storage...\n' 00:24:12.230 * Looking for test storage... 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=83625058304 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=13098115072 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:12.230 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=108027 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 108027 /var/tmp/spdk.sock 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 108027 ']' 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:12.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
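At this point start_intr_tgt launches the interrupt_tgt example app on cores 0-2 (-m 0x07), in interrupt mode (-E) and with single-file hugepage segments (-g), then waitforlisten blocks until the app answers on /var/tmp/spdk.sock. The following is only a minimal sketch of that startup, assuming rpc_get_methods as the liveness probe; the real waitforlisten helper may probe the socket differently.

    #!/usr/bin/env bash
    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc_sock=/var/tmp/spdk.sock

    # Start the interrupt target on a 3-core mask, in interrupt mode, and remember its pid.
    "$spdk/build/examples/interrupt_tgt" -m 0x07 -r "$rpc_sock" -E -g &
    intr_tgt_pid=$!
    trap 'kill "$intr_tgt_pid"' SIGINT SIGTERM EXIT

    # Poll the RPC socket until the app is up (give up after ~100 tries).
    for _ in $(seq 1 100); do
        "$spdk/scripts/rpc.py" -s "$rpc_sock" rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done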
00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:12.230 13:46:59 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:24:12.230 13:46:59 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:24:12.230 [2024-07-15 13:46:59.707890] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:12.230 [2024-07-15 13:46:59.707944] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid108027 ] 00:24:12.230 [2024-07-15 13:46:59.799332] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:12.503 [2024-07-15 13:46:59.899332] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:12.503 [2024-07-15 13:46:59.899354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:12.503 [2024-07-15 13:46:59.899357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:12.503 [2024-07-15 13:46:59.974629] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:24:13.106 13:47:00 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:13.106 13:47:00 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:24:13.106 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:24:13.106 13:47:00 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.365 Malloc0 00:24:13.365 Malloc1 00:24:13.365 Malloc2 00:24:13.365 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:24:13.365 13:47:00 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:24:13.365 13:47:00 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:24:13.365 13:47:00 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:24:13.365 5000+0 records in 00:24:13.365 5000+0 records out 00:24:13.366 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0274044 s, 374 MB/s 00:24:13.366 13:47:00 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:24:13.366 AIO0 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 108027 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 108027 without_thd 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=108027 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:24:13.623 13:47:00 reactor_set_interrupt -- 
interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:13.623 13:47:00 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:13.623 13:47:01 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:24:13.881 spdk_thread ids are 1 on reactor0. 
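reactor_get_thread_ids, traced above, resolves which SPDK thread ids live on a reactor by asking thread_get_stats over RPC and filtering on the thread cpumask (0x1 for reactor 0, 0x4 for reactor 2); the 0x prefix is stripped because thread_get_stats reports the mask without it. A small sketch of that lookup against the default /var/tmp/spdk.sock socket:

    #!/usr/bin/env bash
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    # reactor_thread_ids 0x1 -> prints the ids of threads whose cpumask matches.
    reactor_thread_ids() {
        local mask=$1
        # thread_get_stats reports cpumask without the 0x prefix, so strip it first.
        "$rpc_py" thread_get_stats \
            | jq --arg m "${mask#0x}" '.threads[] | select(.cpumask == $m) | .id'
    }

    reactor_thread_ids 0x1   # in this trace: thread id 1 (app_thread)
    reactor_thread_ids 0x4   # empty until a thread is scheduled on reactor 2

In the trace this yields id 1 for reactor 0 and nothing for reactor 2, which is why thd2_ids ends up empty.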
00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 108027 0 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108027 0 idle 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:13.881 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108027 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.34 reactor_0' 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108027 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.34 reactor_0 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 108027 1 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108027 1 idle 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:24:14.139 
13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108071 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108071 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:14.139 13:47:01 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 108027 2 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108027 2 idle 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:14.140 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108072 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108072 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:24:14.398 13:47:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:24:14.398 13:47:01 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:24:14.655 [2024-07-15 13:47:02.067992] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:24:14.655 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:24:14.655 [2024-07-15 13:47:02.243842] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:24:14.655 [2024-07-15 13:47:02.244071] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:14.655 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:24:14.913 [2024-07-15 13:47:02.415849] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:24:14.913 [2024-07-15 13:47:02.415972] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 108027 0 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 108027 0 busy 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:14.913 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108027 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.70 reactor_0' 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108027 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.70 reactor_0 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:15.169 13:47:02 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 108027 2 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 108027 2 busy 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:15.169 13:47:02 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:15.170 13:47:02 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:15.170 13:47:02 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:15.170 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:15.170 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108072 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2' 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108072 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:24:15.427 [2024-07-15 13:47:02.963832] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
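The reactor_is_busy_or_idle checks above classify each reactor by sampling one batch iteration of top for the target pid and reading the %CPU column of the matching reactor_N thread: roughly 99.9% while a reactor is polling, about 0.0% while it sits in interrupt mode. Below is a simplified sketch of that classification using the same thresholds that appear in the trace (busy when not below 70%, idle when not above 30%); the column position assumes the default top field layout seen here.

    #!/usr/bin/env bash
    # reactor_state PID IDX -> prints "busy", "idle" or "unknown" for reactor_IDX of PID.
    reactor_state() {
        local pid=$1 idx=$2 line rate
        line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}") || { echo unknown; return 1; }
        rate=$(awk '{print $9}' <<< "$line")   # %CPU column in this layout
        rate=${rate%.*}                        # drop the fractional part for integer comparison
        if (( rate >= 70 )); then
            echo busy
        elif (( rate <= 30 )); then
            echo idle
        else
            echo unknown
        fi
    }

    reactor_state 108027 0

In the trace, reactor_0 and reactor_2 jump from 0.0% to 99.9% once interrupt mode is disabled, which is exactly the transition the test asserts.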
00:24:15.427 [2024-07-15 13:47:02.963917] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 108027 2 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108027 2 idle 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:15.427 13:47:02 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108072 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.54 reactor_2' 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108072 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.54 reactor_2 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:15.685 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:24:15.942 [2024-07-15 13:47:03.319823] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:24:15.942 [2024-07-15 13:47:03.319922] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:15.942 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:24:15.942 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:24:15.943 [2024-07-15 13:47:03.491954] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
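The sequence above is the core of the test: reactor_set_interrupt_mode <id> -d switches a reactor from interrupt to poll mode (it then spins and shows up busy in top), and the same RPC without -d switches it back. In this without_thd variant, app_thread is first moved off reactor 0 with thread_set_cpumask so the reactor carries no threads while it is flipped, then moved back at the end. The following is a condensed sketch of one disable/enable cycle, not the test's exact ordering; it assumes the reactor_state helper from the previous sketch is already sourced.

    #!/usr/bin/env bash
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    pid=108027

    # Park app_thread (thread id 1) on reactor 1 so reactor 0 carries no threads.
    "$rpc_py" thread_set_cpumask -i 1 -m 0x2

    for reactor in 0 2; do
        # -d disables interrupt mode: the reactor falls back to busy polling (~100% CPU).
        "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$reactor" -d
        [ "$(reactor_state "$pid" "$reactor")" = busy ] || echo "reactor $reactor did not go busy"

        # Without -d the reactor returns to interrupt (event-driven) mode and idles again.
        "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$reactor"
        [ "$(reactor_state "$pid" "$reactor")" = idle ] || echo "reactor $reactor did not go idle"
    done

    # Return app_thread to reactor 0 once both reactors are back in interrupt mode.
    "$rpc_py" thread_set_cpumask -i 1 -m 0x1

Note that --plugin interrupt_plugin only resolves because PYTHONPATH was extended with spdk/examples/interrupt_tgt earlier in the trace, and the real test retries its checks rather than sampling top once.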
00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 108027 0 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108027 0 idle 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108027 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108027 -w 256 00:24:15.943 13:47:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108027 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.41 reactor_0' 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108027 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.41 reactor_0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:24:16.201 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 108027 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 108027 ']' 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 108027 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 108027 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 108027' 00:24:16.201 killing process with pid 108027 00:24:16.201 13:47:03 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 108027 00:24:16.201 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 108027 00:24:16.459 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:24:16.459 13:47:03 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:24:16.459 13:47:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:24:16.459 13:47:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:16.459 13:47:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:24:16.460 13:47:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=108716 00:24:16.460 13:47:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:16.460 13:47:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:24:16.460 13:47:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 108716 /var/tmp/spdk.sock 00:24:16.460 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 108716 ']' 00:24:16.460 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:16.460 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:16.460 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:16.460 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:16.460 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:16.460 13:47:03 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:24:16.460 [2024-07-15 13:47:04.016703] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:16.460 [2024-07-15 13:47:04.016764] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid108716 ] 00:24:16.718 [2024-07-15 13:47:04.105675] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:16.718 [2024-07-15 13:47:04.196785] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:16.718 [2024-07-15 13:47:04.196861] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:16.718 [2024-07-15 13:47:04.196863] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:16.718 [2024-07-15 13:47:04.266481] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
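Teardown of the first target (pid 108027) goes through killprocess, which only signals the pid after confirming it still exists and that its command name is not sudo, then waits for it so the trap-registered cleanup can run before the second target is started. A minimal sketch of that guard, with the same pid and aiofile path as in the trace:

    #!/usr/bin/env bash
    # kill_spdk_app PID: terminate an SPDK app only if it is still alive and not a sudo wrapper.
    kill_spdk_app() {
        local pid=$1 comm
        kill -0 "$pid" 2>/dev/null || return 0          # already gone, nothing to do
        comm=$(ps --no-headers -o comm= "$pid")
        [ "$comm" = sudo ] && return 1                  # never signal the sudo parent directly
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true                 # reap it if it is our own child
    }

    kill_spdk_app 108027
    rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile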
00:24:17.285 13:47:04 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:17.285 13:47:04 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:24:17.285 13:47:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:24:17.285 13:47:04 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:17.543 Malloc0 00:24:17.543 Malloc1 00:24:17.543 Malloc2 00:24:17.543 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:24:17.543 13:47:05 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:24:17.543 13:47:05 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:24:17.543 13:47:05 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:24:17.543 5000+0 records in 00:24:17.543 5000+0 records out 00:24:17.543 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0259718 s, 394 MB/s 00:24:17.543 13:47:05 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:24:17.802 AIO0 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 108716 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 108716 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=108716 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:17.802 13:47:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:18.061 13:47:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:24:18.319 spdk_thread ids are 1 on reactor0. 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 108716 0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108716 0 idle 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108716 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.33 reactor_0' 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108716 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.33 reactor_0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 108716 1 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108716 1 idle 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:24:18.319 13:47:05 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:24:18.319 13:47:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:18.320 13:47:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:18.320 13:47:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:18.320 13:47:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:18.320 13:47:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:18.320 13:47:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:18.320 13:47:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108765 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108765 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 108716 2 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108716 2 idle 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:18.578 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108766 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108766 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
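Both passes of the test back an AIO bdev with a small file before the reactor checks above: setup_bdev_aio writes a 10 MB file with dd and registers it as AIO0 with a 2048-byte block size via bdev_aio_create, skipping the step on FreeBSD (hence the uname guard in the trace). A sketch of that setup with the same sizes:

    #!/usr/bin/env bash
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    aiofile=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile

    if [ "$(uname -s)" != FreeBSD ]; then
        # 5000 blocks of 2048 bytes -> a 10240000-byte backing file, as in the trace.
        dd if=/dev/zero of="$aiofile" bs=2048 count=5000
        # Expose the file as an AIO bdev named AIO0 with a 2048-byte block size.
        "$rpc_py" bdev_aio_create "$aiofile" AIO0 2048
    fi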
00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:24:18.836 [2024-07-15 13:47:06.373431] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:24:18.836 [2024-07-15 13:47:06.373558] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:24:18.836 [2024-07-15 13:47:06.373740] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:18.836 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:24:19.094 [2024-07-15 13:47:06.565800] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:24:19.094 [2024-07-15 13:47:06.565898] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 108716 0 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 108716 0 busy 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:19.094 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108716 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.71 reactor_0' 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108716 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.71 reactor_0 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:19.351 13:47:06 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 108716 2 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 108716 2 busy 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108766 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108766 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:19.351 13:47:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:24:19.607 [2024-07-15 13:47:07.103296] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
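The mode switches themselves are plain RPC calls through the interrupt_plugin shown in the trace: reactor_set_interrupt_mode takes the reactor index plus an optional -d to disable interrupt mode (drop the reactor back to polling); without -d the reactor is switched back to interrupt mode. Only the reactor hosting app_thread (reactor 0 here) logs the extra thread.c notice about moving the SPDK thread between poll and intr mode. Invoked as in this run, with the paths shortened to their repository-relative form:

    # Force reactors 0 and 2 into poll mode (reactor_set_interrupt.sh@43/@44 above) ...
    ./scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
    ./scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
    # ... and return them to interrupt mode afterwards (@51/@62).
    ./scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2
    ./scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0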
00:24:19.607 [2024-07-15 13:47:07.103404] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 108716 2 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108716 2 idle 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:19.607 13:47:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108766 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.53 reactor_2' 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108766 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.53 reactor_2 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:24:19.864 [2024-07-15 13:47:07.464205] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:24:19.864 [2024-07-15 13:47:07.464435] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
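Put together, this iteration of the test is a simple round trip: disable interrupt mode on reactors 0 and 2, assert that both now burn CPU polling, then re-enable interrupt mode one reactor at a time and assert that each goes idle again. Condensed into the shape of reactor_set_interrupt.sh as traced here ($spdk_pid stands in for 108716 and $rpc_py for the rpc.py invocation shown above):

    # Condensed flow of reactor_set_interrupt.sh@43..@70 for this run.
    $rpc_py reactor_set_interrupt_mode 0 -d
    $rpc_py reactor_set_interrupt_mode 2 -d
    for i in 0 2; do
        reactor_is_busy "$spdk_pid" "$i"      # poll-mode reactors should report ~100% CPU
    done
    $rpc_py reactor_set_interrupt_mode 2
    reactor_is_idle "$spdk_pid" 2             # back in interrupt mode -> ~0% CPU
    $rpc_py reactor_set_interrupt_mode 0
    reactor_is_idle "$spdk_pid" 0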
00:24:19.864 [2024-07-15 13:47:07.464452] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 108716 0 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 108716 0 idle 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=108716 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:19.864 13:47:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:20.121 13:47:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 108716 -w 256 00:24:20.121 13:47:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:20.121 13:47:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 108716 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.42 reactor_0' 00:24:20.121 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 108716 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.42 reactor_0 00:24:20.121 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:20.121 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:24:20.122 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 108716 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 108716 ']' 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 108716 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 108716 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
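The teardown that runs next is killprocess from common/autotest_common.sh: refuse an empty pid, confirm the process still exists, check that it was not started through sudo (its command name here is reactor_0), then announce, kill and reap it with the echo/kill/wait steps that follow below. Sketched from the trace:

    # killprocess as it plays out in this trace (sketch; the sudo branch is not exercised here).
    killprocess() {
        local pid=$1
        [[ -n $pid ]] || return 1                          # @948: require a pid
        kill -0 "$pid" || return 0                         # @952: already gone, nothing to do
        if [[ $(uname) == Linux ]]; then                   # @953
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")   # @954: reactor_0 in this run
            # @958: a sudo-wrapped process would need different handling; skipped here
        fi
        echo "killing process with pid $pid"               # @966
        kill "$pid"                                        # @967
        wait "$pid"                                        # @972: reap it before cleanup runs
    }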
00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 108716' 00:24:20.122 killing process with pid 108716 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 108716 00:24:20.122 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 108716 00:24:20.380 13:47:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:24:20.380 13:47:07 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:24:20.380 00:24:20.380 real 0m8.527s 00:24:20.380 user 0m7.531s 00:24:20.380 sys 0m1.932s 00:24:20.380 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:20.380 13:47:07 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:24:20.380 ************************************ 00:24:20.380 END TEST reactor_set_interrupt 00:24:20.380 ************************************ 00:24:20.380 13:47:07 -- common/autotest_common.sh@1142 -- # return 0 00:24:20.380 13:47:07 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:24:20.380 13:47:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:24:20.380 13:47:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:20.380 13:47:07 -- common/autotest_common.sh@10 -- # set +x 00:24:20.380 ************************************ 00:24:20.380 START TEST reap_unregistered_poller 00:24:20.380 ************************************ 00:24:20.380 13:47:07 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:24:20.641 * Looking for test storage... 00:24:20.641 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
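Both the END TEST banner above and the START TEST banner for reap_unregistered_poller come from the run_test wrapper: it checks that it was handed a name plus a command (@1099), prints the banners with xtrace temporarily disabled (@1105/@10), times the command (the real/user/sys lines), and propagates its exit status. A rough sketch of that shape, inferred from the trace rather than copied from the helper:

    # Rough shape of run_test as observed in this trace.
    run_test() {
        (( $# > 1 )) || return 1             # @1099: needs a test name plus a command
        local test_name=$1; shift
        set +x                               # banners are printed with xtrace disabled
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        set -x
        time "$@"                            # produces the real/user/sys lines seen above
        local rc=$?
        set +x
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
        set -x
        return $rc
    }
    # e.g. run_test reap_unregistered_poller "$rootdir/test/interrupt/reap_unregistered_poller.sh"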
00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:20.641 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:24:20.641 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:24:20.641 13:47:08 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:24:20.642 13:47:08 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:24:20.642 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:24:20.642 #define SPDK_CONFIG_H 00:24:20.642 #define SPDK_CONFIG_APPS 1 00:24:20.642 #define SPDK_CONFIG_ARCH native 00:24:20.642 #undef SPDK_CONFIG_ASAN 00:24:20.642 #undef SPDK_CONFIG_AVAHI 00:24:20.642 #undef SPDK_CONFIG_CET 00:24:20.642 #define SPDK_CONFIG_COVERAGE 1 00:24:20.642 #define SPDK_CONFIG_CROSS_PREFIX 00:24:20.642 #define SPDK_CONFIG_CRYPTO 1 00:24:20.642 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:24:20.642 #undef SPDK_CONFIG_CUSTOMOCF 00:24:20.642 #undef SPDK_CONFIG_DAOS 00:24:20.642 #define SPDK_CONFIG_DAOS_DIR 00:24:20.642 #define SPDK_CONFIG_DEBUG 1 00:24:20.642 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:24:20.642 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:20.642 #define SPDK_CONFIG_DPDK_INC_DIR 00:24:20.642 #define SPDK_CONFIG_DPDK_LIB_DIR 00:24:20.642 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:24:20.642 #undef SPDK_CONFIG_DPDK_UADK 00:24:20.642 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:20.642 #define SPDK_CONFIG_EXAMPLES 1 00:24:20.642 #undef SPDK_CONFIG_FC 00:24:20.642 #define SPDK_CONFIG_FC_PATH 00:24:20.642 #define SPDK_CONFIG_FIO_PLUGIN 1 00:24:20.642 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:24:20.642 #undef SPDK_CONFIG_FUSE 00:24:20.642 #undef SPDK_CONFIG_FUZZER 00:24:20.642 #define SPDK_CONFIG_FUZZER_LIB 00:24:20.642 #undef SPDK_CONFIG_GOLANG 00:24:20.642 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:24:20.642 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:24:20.642 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:24:20.642 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:24:20.642 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:24:20.642 #undef SPDK_CONFIG_HAVE_LIBBSD 00:24:20.642 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:24:20.642 #define SPDK_CONFIG_IDXD 1 00:24:20.642 #define SPDK_CONFIG_IDXD_KERNEL 1 00:24:20.642 #define SPDK_CONFIG_IPSEC_MB 1 00:24:20.642 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:20.642 #define SPDK_CONFIG_ISAL 1 00:24:20.642 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:24:20.642 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:24:20.642 #define SPDK_CONFIG_LIBDIR 00:24:20.642 #undef SPDK_CONFIG_LTO 
00:24:20.642 #define SPDK_CONFIG_MAX_LCORES 128 00:24:20.642 #define SPDK_CONFIG_NVME_CUSE 1 00:24:20.642 #undef SPDK_CONFIG_OCF 00:24:20.642 #define SPDK_CONFIG_OCF_PATH 00:24:20.642 #define SPDK_CONFIG_OPENSSL_PATH 00:24:20.642 #undef SPDK_CONFIG_PGO_CAPTURE 00:24:20.642 #define SPDK_CONFIG_PGO_DIR 00:24:20.642 #undef SPDK_CONFIG_PGO_USE 00:24:20.642 #define SPDK_CONFIG_PREFIX /usr/local 00:24:20.642 #undef SPDK_CONFIG_RAID5F 00:24:20.642 #undef SPDK_CONFIG_RBD 00:24:20.642 #define SPDK_CONFIG_RDMA 1 00:24:20.642 #define SPDK_CONFIG_RDMA_PROV verbs 00:24:20.642 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:24:20.642 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:24:20.642 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:24:20.642 #define SPDK_CONFIG_SHARED 1 00:24:20.642 #undef SPDK_CONFIG_SMA 00:24:20.642 #define SPDK_CONFIG_TESTS 1 00:24:20.642 #undef SPDK_CONFIG_TSAN 00:24:20.642 #define SPDK_CONFIG_UBLK 1 00:24:20.642 #define SPDK_CONFIG_UBSAN 1 00:24:20.642 #undef SPDK_CONFIG_UNIT_TESTS 00:24:20.642 #undef SPDK_CONFIG_URING 00:24:20.642 #define SPDK_CONFIG_URING_PATH 00:24:20.642 #undef SPDK_CONFIG_URING_ZNS 00:24:20.642 #undef SPDK_CONFIG_USDT 00:24:20.642 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:24:20.642 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:24:20.642 #undef SPDK_CONFIG_VFIO_USER 00:24:20.642 #define SPDK_CONFIG_VFIO_USER_DIR 00:24:20.642 #define SPDK_CONFIG_VHOST 1 00:24:20.642 #define SPDK_CONFIG_VIRTIO 1 00:24:20.642 #undef SPDK_CONFIG_VTUNE 00:24:20.642 #define SPDK_CONFIG_VTUNE_DIR 00:24:20.642 #define SPDK_CONFIG_WERROR 1 00:24:20.642 #define SPDK_CONFIG_WPDK_DIR 00:24:20.642 #undef SPDK_CONFIG_XNVME 00:24:20.642 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:24:20.642 13:47:08 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:24:20.642 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:20.643 13:47:08 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:20.643 13:47:08 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:20.643 13:47:08 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:20.643 13:47:08 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:20.643 13:47:08 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:24:20.643 13:47:08 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:24:20.643 13:47:08 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:24:20.643 13:47:08 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:24:20.643 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:24:20.644 13:47:08 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:20.644 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 109367 ]] 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 109367 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v 
testdir ]] 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.UTDIyh 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.UTDIyh/tests/interrupt /tmp/spdk.UTDIyh 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:24:20.645 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=955527168 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4328902656 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=83624902656 00:24:20.905 13:47:08 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508580864 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=10883678208 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249580032 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254290432 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892238848 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901716992 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9478144 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253454848 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254290432 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=835584 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450852352 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450856448 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:24:20.905 * Looking for test storage... 
00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=83624902656 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=13098270720 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.905 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:24:20.905 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=109437 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:20.905 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 109437 /var/tmp/spdk.sock 00:24:20.906 13:47:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:24:20.906 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 109437 ']' 00:24:20.906 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:20.906 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:20.906 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:24:20.906 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:20.906 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:20.906 13:47:08 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:20.906 [2024-07-15 13:47:08.328961] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:20.906 [2024-07-15 13:47:08.329026] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109437 ] 00:24:20.906 [2024-07-15 13:47:08.416233] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:20.906 [2024-07-15 13:47:08.502294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:20.906 [2024-07-15 13:47:08.502382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:20.906 [2024-07-15 13:47:08.502384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:21.165 [2024-07-15 13:47:08.571652] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:24:21.731 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:21.731 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:24:21.731 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:24:21.731 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:21.731 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:24:21.731 "name": "app_thread", 00:24:21.731 "id": 1, 00:24:21.731 "active_pollers": [], 00:24:21.731 "timed_pollers": [ 00:24:21.731 { 00:24:21.731 "name": "rpc_subsystem_poll_servers", 00:24:21.731 "id": 1, 00:24:21.731 "state": "waiting", 00:24:21.731 "run_count": 0, 00:24:21.731 "busy_count": 0, 00:24:21.731 "period_ticks": 9200000 00:24:21.731 } 00:24:21.731 ], 00:24:21.731 "paused_pollers": [] 00:24:21.731 }' 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/common.sh@76 -- 
# dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:24:21.731 5000+0 records in 00:24:21.731 5000+0 records out 00:24:21.731 10240000 bytes (10 MB, 9.8 MiB) copied, 0.018776 s, 545 MB/s 00:24:21.731 13:47:09 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:24:21.988 AIO0 00:24:21.988 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:24:22.247 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:24:22.247 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:24:22.247 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:24:22.247 "name": "app_thread", 00:24:22.247 "id": 1, 00:24:22.247 "active_pollers": [], 00:24:22.247 "timed_pollers": [ 00:24:22.247 { 00:24:22.247 "name": "rpc_subsystem_poll_servers", 00:24:22.247 "id": 1, 00:24:22.247 "state": "waiting", 00:24:22.247 "run_count": 0, 00:24:22.247 "busy_count": 0, 00:24:22.247 "period_ticks": 9200000 00:24:22.247 } 00:24:22.247 ], 00:24:22.247 "paused_pollers": [] 00:24:22.247 }' 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:24:22.247 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:24:22.505 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:24:22.505 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:24:22.505 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:24:22.505 13:47:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 109437 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 109437 ']' 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 109437 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 109437 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:22.505 13:47:09 
reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 109437' 00:24:22.505 killing process with pid 109437 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 109437 00:24:22.505 13:47:09 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 109437 00:24:22.763 13:47:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:24:22.764 13:47:10 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:24:22.764 00:24:22.764 real 0m2.152s 00:24:22.764 user 0m1.227s 00:24:22.764 sys 0m0.607s 00:24:22.764 13:47:10 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:22.764 13:47:10 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:22.764 ************************************ 00:24:22.764 END TEST reap_unregistered_poller 00:24:22.764 ************************************ 00:24:22.764 13:47:10 -- common/autotest_common.sh@1142 -- # return 0 00:24:22.764 13:47:10 -- spdk/autotest.sh@198 -- # uname -s 00:24:22.764 13:47:10 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:24:22.764 13:47:10 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:24:22.764 13:47:10 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:24:22.764 13:47:10 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@260 -- # timing_exit lib 00:24:22.764 13:47:10 -- common/autotest_common.sh@728 -- # xtrace_disable 00:24:22.764 13:47:10 -- common/autotest_common.sh@10 -- # set +x 00:24:22.764 13:47:10 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:24:22.764 13:47:10 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:24:22.764 13:47:10 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:24:22.764 13:47:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:22.764 13:47:10 -- common/autotest_common.sh@10 -- # set +x 00:24:22.764 ************************************ 00:24:22.764 START TEST compress_compdev 00:24:22.764 ************************************ 00:24:22.764 13:47:10 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:24:23.022 * Looking for test storage... 
00:24:23.022 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00d40ca9-2a78-e711-906e-0017a4403562 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=00d40ca9-2a78-e711-906e-0017a4403562 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:23.022 13:47:10 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:23.022 13:47:10 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:23.022 13:47:10 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:23.022 13:47:10 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:23.022 13:47:10 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:23.022 13:47:10 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:23.022 13:47:10 compress_compdev -- paths/export.sh@5 -- # export PATH 00:24:23.022 13:47:10 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:23.022 13:47:10 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=109718 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 109718 00:24:23.022 13:47:10 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 109718 ']' 00:24:23.022 13:47:10 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:23.022 13:47:10 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:23.022 13:47:10 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:23.022 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:24:23.022 13:47:10 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:23.022 13:47:10 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:23.022 13:47:10 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:24:23.022 [2024-07-15 13:47:10.478957] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:23.022 [2024-07-15 13:47:10.479020] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109718 ] 00:24:23.022 [2024-07-15 13:47:10.567028] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:23.280 [2024-07-15 13:47:10.658301] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:23.280 [2024-07-15 13:47:10.658304] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:23.866 [2024-07-15 13:47:11.208078] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:23.866 13:47:11 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:23.866 13:47:11 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:23.866 13:47:11 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:24:23.866 13:47:11 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:23.866 13:47:11 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:24.429 [2024-07-15 13:47:11.786083] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x264aef0 PMD being used: compress_qat 00:24:24.429 13:47:11 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:24.429 13:47:11 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:24.429 13:47:11 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:24.429 13:47:11 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:24.429 13:47:11 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:24.429 13:47:11 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:24.429 13:47:11 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:24.429 13:47:12 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:24.686 [ 00:24:24.686 { 00:24:24.686 "name": "Nvme0n1", 00:24:24.686 "aliases": [ 00:24:24.686 "01000000-0000-0000-5cd2-e42bec7b5351" 00:24:24.686 ], 00:24:24.686 "product_name": "NVMe disk", 00:24:24.686 "block_size": 512, 00:24:24.686 "num_blocks": 7501476528, 00:24:24.686 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:24:24.686 "assigned_rate_limits": { 00:24:24.686 "rw_ios_per_sec": 0, 00:24:24.686 "rw_mbytes_per_sec": 0, 00:24:24.686 "r_mbytes_per_sec": 0, 00:24:24.686 "w_mbytes_per_sec": 0 00:24:24.686 }, 00:24:24.686 "claimed": false, 00:24:24.686 "zoned": false, 00:24:24.686 "supported_io_types": { 00:24:24.686 "read": true, 00:24:24.686 
"write": true, 00:24:24.686 "unmap": true, 00:24:24.686 "flush": true, 00:24:24.686 "reset": true, 00:24:24.686 "nvme_admin": true, 00:24:24.686 "nvme_io": true, 00:24:24.686 "nvme_io_md": false, 00:24:24.686 "write_zeroes": true, 00:24:24.686 "zcopy": false, 00:24:24.686 "get_zone_info": false, 00:24:24.686 "zone_management": false, 00:24:24.686 "zone_append": false, 00:24:24.686 "compare": false, 00:24:24.686 "compare_and_write": false, 00:24:24.686 "abort": true, 00:24:24.686 "seek_hole": false, 00:24:24.686 "seek_data": false, 00:24:24.686 "copy": false, 00:24:24.686 "nvme_iov_md": false 00:24:24.686 }, 00:24:24.686 "driver_specific": { 00:24:24.686 "nvme": [ 00:24:24.686 { 00:24:24.686 "pci_address": "0000:5e:00.0", 00:24:24.686 "trid": { 00:24:24.686 "trtype": "PCIe", 00:24:24.686 "traddr": "0000:5e:00.0" 00:24:24.686 }, 00:24:24.686 "ctrlr_data": { 00:24:24.686 "cntlid": 0, 00:24:24.686 "vendor_id": "0x8086", 00:24:24.686 "model_number": "INTEL SSDPF2KX038T1", 00:24:24.686 "serial_number": "PHAX137100D13P8CGN", 00:24:24.686 "firmware_revision": "9CV10015", 00:24:24.686 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:24:24.686 "oacs": { 00:24:24.686 "security": 0, 00:24:24.686 "format": 1, 00:24:24.686 "firmware": 1, 00:24:24.686 "ns_manage": 1 00:24:24.686 }, 00:24:24.686 "multi_ctrlr": false, 00:24:24.686 "ana_reporting": false 00:24:24.686 }, 00:24:24.686 "vs": { 00:24:24.686 "nvme_version": "1.4" 00:24:24.686 }, 00:24:24.686 "ns_data": { 00:24:24.686 "id": 1, 00:24:24.686 "can_share": false 00:24:24.686 } 00:24:24.686 } 00:24:24.686 ], 00:24:24.686 "mp_policy": "active_passive" 00:24:24.686 } 00:24:24.686 } 00:24:24.686 ] 00:24:24.686 13:47:12 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:24.686 13:47:12 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:24.943 [2024-07-15 13:47:12.334524] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2499490 PMD being used: compress_qat 00:24:24.943 cce99e9a-0890-479a-b48d-0e3ccb75d5dd 00:24:24.943 13:47:12 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:24.943 df03b465-5e1f-4b66-9fa7-cbae5e1f47ee 00:24:24.943 13:47:12 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:24.943 13:47:12 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:24.943 13:47:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:24.943 13:47:12 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:24.943 13:47:12 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:24.943 13:47:12 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:24.943 13:47:12 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:25.206 13:47:12 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:25.464 [ 00:24:25.464 { 00:24:25.464 "name": "df03b465-5e1f-4b66-9fa7-cbae5e1f47ee", 00:24:25.464 "aliases": [ 00:24:25.464 "lvs0/lv0" 00:24:25.464 ], 00:24:25.464 "product_name": "Logical Volume", 00:24:25.464 "block_size": 512, 00:24:25.464 "num_blocks": 204800, 00:24:25.464 "uuid": 
"df03b465-5e1f-4b66-9fa7-cbae5e1f47ee", 00:24:25.464 "assigned_rate_limits": { 00:24:25.464 "rw_ios_per_sec": 0, 00:24:25.464 "rw_mbytes_per_sec": 0, 00:24:25.464 "r_mbytes_per_sec": 0, 00:24:25.464 "w_mbytes_per_sec": 0 00:24:25.464 }, 00:24:25.464 "claimed": false, 00:24:25.464 "zoned": false, 00:24:25.464 "supported_io_types": { 00:24:25.464 "read": true, 00:24:25.464 "write": true, 00:24:25.464 "unmap": true, 00:24:25.464 "flush": false, 00:24:25.464 "reset": true, 00:24:25.464 "nvme_admin": false, 00:24:25.464 "nvme_io": false, 00:24:25.464 "nvme_io_md": false, 00:24:25.464 "write_zeroes": true, 00:24:25.464 "zcopy": false, 00:24:25.464 "get_zone_info": false, 00:24:25.464 "zone_management": false, 00:24:25.464 "zone_append": false, 00:24:25.464 "compare": false, 00:24:25.464 "compare_and_write": false, 00:24:25.464 "abort": false, 00:24:25.464 "seek_hole": true, 00:24:25.464 "seek_data": true, 00:24:25.464 "copy": false, 00:24:25.464 "nvme_iov_md": false 00:24:25.464 }, 00:24:25.464 "driver_specific": { 00:24:25.464 "lvol": { 00:24:25.464 "lvol_store_uuid": "cce99e9a-0890-479a-b48d-0e3ccb75d5dd", 00:24:25.464 "base_bdev": "Nvme0n1", 00:24:25.464 "thin_provision": true, 00:24:25.464 "num_allocated_clusters": 0, 00:24:25.464 "snapshot": false, 00:24:25.464 "clone": false, 00:24:25.464 "esnap_clone": false 00:24:25.464 } 00:24:25.464 } 00:24:25.464 } 00:24:25.464 ] 00:24:25.464 13:47:12 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:25.464 13:47:12 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:25.464 13:47:12 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:25.464 [2024-07-15 13:47:13.077882] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:25.464 COMP_lvs0/lv0 00:24:25.777 13:47:13 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:25.777 13:47:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:26.037 [ 00:24:26.037 { 00:24:26.037 "name": "COMP_lvs0/lv0", 00:24:26.037 "aliases": [ 00:24:26.037 "de9da36c-c593-57c2-a22f-914d59ab3833" 00:24:26.037 ], 00:24:26.037 "product_name": "compress", 00:24:26.037 "block_size": 512, 00:24:26.037 "num_blocks": 200704, 00:24:26.037 "uuid": "de9da36c-c593-57c2-a22f-914d59ab3833", 00:24:26.037 "assigned_rate_limits": { 00:24:26.037 "rw_ios_per_sec": 0, 00:24:26.037 "rw_mbytes_per_sec": 0, 00:24:26.037 "r_mbytes_per_sec": 0, 00:24:26.037 "w_mbytes_per_sec": 0 00:24:26.037 }, 00:24:26.037 "claimed": false, 00:24:26.037 "zoned": false, 00:24:26.037 "supported_io_types": { 00:24:26.037 "read": true, 00:24:26.037 "write": true, 00:24:26.037 "unmap": false, 
00:24:26.037 "flush": false, 00:24:26.037 "reset": false, 00:24:26.037 "nvme_admin": false, 00:24:26.037 "nvme_io": false, 00:24:26.037 "nvme_io_md": false, 00:24:26.037 "write_zeroes": true, 00:24:26.037 "zcopy": false, 00:24:26.037 "get_zone_info": false, 00:24:26.037 "zone_management": false, 00:24:26.037 "zone_append": false, 00:24:26.037 "compare": false, 00:24:26.037 "compare_and_write": false, 00:24:26.037 "abort": false, 00:24:26.037 "seek_hole": false, 00:24:26.037 "seek_data": false, 00:24:26.037 "copy": false, 00:24:26.037 "nvme_iov_md": false 00:24:26.037 }, 00:24:26.037 "driver_specific": { 00:24:26.037 "compress": { 00:24:26.037 "name": "COMP_lvs0/lv0", 00:24:26.037 "base_bdev_name": "df03b465-5e1f-4b66-9fa7-cbae5e1f47ee" 00:24:26.037 } 00:24:26.037 } 00:24:26.037 } 00:24:26.037 ] 00:24:26.037 13:47:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:26.037 13:47:13 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:26.037 [2024-07-15 13:47:13.567726] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f01781b15c0 PMD being used: compress_qat 00:24:26.037 [2024-07-15 13:47:13.569707] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2481530 PMD being used: compress_qat 00:24:26.037 Running I/O for 3 seconds... 00:24:29.313 00:24:29.314 Latency(us) 00:24:29.314 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:29.314 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:29.314 Verification LBA range: start 0x0 length 0x3100 00:24:29.314 COMP_lvs0/lv0 : 3.00 5329.87 20.82 0.00 0.00 5968.20 537.82 5527.82 00:24:29.314 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:29.314 Verification LBA range: start 0x3100 length 0x3100 00:24:29.314 COMP_lvs0/lv0 : 3.00 5620.72 21.96 0.00 0.00 5663.09 386.45 5328.36 00:24:29.314 =================================================================================================================== 00:24:29.314 Total : 10950.59 42.78 0.00 0.00 5811.61 386.45 5527.82 00:24:29.314 0 00:24:29.314 13:47:16 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:24:29.314 13:47:16 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:29.314 13:47:16 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:29.571 13:47:16 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:29.571 13:47:16 compress_compdev -- compress/compress.sh@78 -- # killprocess 109718 00:24:29.571 13:47:16 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 109718 ']' 00:24:29.571 13:47:16 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 109718 00:24:29.571 13:47:16 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:29.571 13:47:16 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:29.571 13:47:16 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 109718 00:24:29.571 13:47:17 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:29.571 13:47:17 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:29.571 13:47:17 compress_compdev -- common/autotest_common.sh@966 -- 
# echo 'killing process with pid 109718' 00:24:29.571 killing process with pid 109718 00:24:29.571 13:47:17 compress_compdev -- common/autotest_common.sh@967 -- # kill 109718 00:24:29.571 Received shutdown signal, test time was about 3.000000 seconds 00:24:29.571 00:24:29.571 Latency(us) 00:24:29.571 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:29.571 =================================================================================================================== 00:24:29.571 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:29.571 13:47:17 compress_compdev -- common/autotest_common.sh@972 -- # wait 109718 00:24:31.467 13:47:18 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:24:31.467 13:47:18 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:31.467 13:47:18 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=110801 00:24:31.467 13:47:18 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:31.467 13:47:18 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:24:31.467 13:47:18 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 110801 00:24:31.467 13:47:18 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 110801 ']' 00:24:31.467 13:47:18 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:31.467 13:47:18 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:31.467 13:47:18 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:31.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:31.467 13:47:18 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:31.467 13:47:18 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:31.467 [2024-07-15 13:47:18.707039] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:24:31.467 [2024-07-15 13:47:18.707101] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid110801 ] 00:24:31.467 [2024-07-15 13:47:18.794023] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:31.467 [2024-07-15 13:47:18.882261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:31.467 [2024-07-15 13:47:18.882265] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:32.031 [2024-07-15 13:47:19.429515] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:32.031 13:47:19 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:32.031 13:47:19 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:32.031 13:47:19 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:24:32.031 13:47:19 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:32.031 13:47:19 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:32.595 [2024-07-15 13:47:19.988307] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c82ef0 PMD being used: compress_qat 00:24:32.595 13:47:20 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:32.595 13:47:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:32.853 [ 00:24:32.853 { 00:24:32.853 "name": "Nvme0n1", 00:24:32.853 "aliases": [ 00:24:32.853 "01000000-0000-0000-5cd2-e42bec7b5351" 00:24:32.853 ], 00:24:32.853 "product_name": "NVMe disk", 00:24:32.853 "block_size": 512, 00:24:32.853 "num_blocks": 7501476528, 00:24:32.853 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:24:32.853 "assigned_rate_limits": { 00:24:32.853 "rw_ios_per_sec": 0, 00:24:32.853 "rw_mbytes_per_sec": 0, 00:24:32.853 "r_mbytes_per_sec": 0, 00:24:32.853 "w_mbytes_per_sec": 0 00:24:32.853 }, 00:24:32.853 "claimed": false, 00:24:32.853 "zoned": false, 00:24:32.853 "supported_io_types": { 00:24:32.853 "read": true, 00:24:32.853 "write": true, 00:24:32.853 "unmap": true, 00:24:32.853 "flush": true, 00:24:32.853 "reset": true, 00:24:32.853 "nvme_admin": true, 00:24:32.853 "nvme_io": true, 00:24:32.853 "nvme_io_md": false, 00:24:32.853 "write_zeroes": true, 00:24:32.853 "zcopy": false, 00:24:32.853 "get_zone_info": false, 00:24:32.853 "zone_management": false, 00:24:32.853 "zone_append": false, 00:24:32.853 "compare": false, 00:24:32.853 "compare_and_write": false, 00:24:32.853 "abort": true, 00:24:32.853 "seek_hole": false, 00:24:32.853 "seek_data": false, 00:24:32.853 
"copy": false, 00:24:32.853 "nvme_iov_md": false 00:24:32.853 }, 00:24:32.853 "driver_specific": { 00:24:32.853 "nvme": [ 00:24:32.853 { 00:24:32.853 "pci_address": "0000:5e:00.0", 00:24:32.853 "trid": { 00:24:32.853 "trtype": "PCIe", 00:24:32.853 "traddr": "0000:5e:00.0" 00:24:32.853 }, 00:24:32.853 "ctrlr_data": { 00:24:32.853 "cntlid": 0, 00:24:32.853 "vendor_id": "0x8086", 00:24:32.853 "model_number": "INTEL SSDPF2KX038T1", 00:24:32.853 "serial_number": "PHAX137100D13P8CGN", 00:24:32.853 "firmware_revision": "9CV10015", 00:24:32.853 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:24:32.853 "oacs": { 00:24:32.853 "security": 0, 00:24:32.853 "format": 1, 00:24:32.853 "firmware": 1, 00:24:32.853 "ns_manage": 1 00:24:32.853 }, 00:24:32.853 "multi_ctrlr": false, 00:24:32.853 "ana_reporting": false 00:24:32.853 }, 00:24:32.853 "vs": { 00:24:32.853 "nvme_version": "1.4" 00:24:32.853 }, 00:24:32.853 "ns_data": { 00:24:32.853 "id": 1, 00:24:32.853 "can_share": false 00:24:32.853 } 00:24:32.853 } 00:24:32.853 ], 00:24:32.853 "mp_policy": "active_passive" 00:24:32.853 } 00:24:32.853 } 00:24:32.853 ] 00:24:32.853 13:47:20 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:32.853 13:47:20 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:33.111 [2024-07-15 13:47:20.548645] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ad1380 PMD being used: compress_qat 00:24:33.111 f9cb231e-a578-422c-a1dd-2be7eec6c29a 00:24:33.111 13:47:20 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:33.111 79cda785-d97e-4fa7-9551-db49ae82f289 00:24:33.369 13:47:20 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:33.369 13:47:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:33.626 [ 00:24:33.626 { 00:24:33.626 "name": "79cda785-d97e-4fa7-9551-db49ae82f289", 00:24:33.626 "aliases": [ 00:24:33.626 "lvs0/lv0" 00:24:33.626 ], 00:24:33.626 "product_name": "Logical Volume", 00:24:33.626 "block_size": 512, 00:24:33.626 "num_blocks": 204800, 00:24:33.626 "uuid": "79cda785-d97e-4fa7-9551-db49ae82f289", 00:24:33.626 "assigned_rate_limits": { 00:24:33.626 "rw_ios_per_sec": 0, 00:24:33.626 "rw_mbytes_per_sec": 0, 00:24:33.626 "r_mbytes_per_sec": 0, 00:24:33.626 "w_mbytes_per_sec": 0 00:24:33.626 }, 00:24:33.626 "claimed": false, 00:24:33.626 "zoned": false, 00:24:33.626 "supported_io_types": { 00:24:33.626 "read": true, 00:24:33.626 "write": true, 00:24:33.626 "unmap": true, 00:24:33.626 "flush": false, 00:24:33.626 "reset": true, 00:24:33.626 "nvme_admin": false, 00:24:33.626 "nvme_io": false, 00:24:33.626 
"nvme_io_md": false, 00:24:33.626 "write_zeroes": true, 00:24:33.626 "zcopy": false, 00:24:33.626 "get_zone_info": false, 00:24:33.626 "zone_management": false, 00:24:33.626 "zone_append": false, 00:24:33.626 "compare": false, 00:24:33.626 "compare_and_write": false, 00:24:33.626 "abort": false, 00:24:33.626 "seek_hole": true, 00:24:33.626 "seek_data": true, 00:24:33.626 "copy": false, 00:24:33.626 "nvme_iov_md": false 00:24:33.626 }, 00:24:33.626 "driver_specific": { 00:24:33.626 "lvol": { 00:24:33.626 "lvol_store_uuid": "f9cb231e-a578-422c-a1dd-2be7eec6c29a", 00:24:33.626 "base_bdev": "Nvme0n1", 00:24:33.626 "thin_provision": true, 00:24:33.626 "num_allocated_clusters": 0, 00:24:33.626 "snapshot": false, 00:24:33.626 "clone": false, 00:24:33.626 "esnap_clone": false 00:24:33.626 } 00:24:33.626 } 00:24:33.626 } 00:24:33.626 ] 00:24:33.626 13:47:21 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:33.626 13:47:21 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:24:33.626 13:47:21 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:24:33.883 [2024-07-15 13:47:21.247614] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:33.883 COMP_lvs0/lv0 00:24:33.883 13:47:21 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:33.883 13:47:21 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:34.140 [ 00:24:34.140 { 00:24:34.140 "name": "COMP_lvs0/lv0", 00:24:34.140 "aliases": [ 00:24:34.140 "b2cdb50e-eabf-5ee0-9695-9cc2ed6a6b96" 00:24:34.140 ], 00:24:34.140 "product_name": "compress", 00:24:34.140 "block_size": 512, 00:24:34.140 "num_blocks": 200704, 00:24:34.140 "uuid": "b2cdb50e-eabf-5ee0-9695-9cc2ed6a6b96", 00:24:34.140 "assigned_rate_limits": { 00:24:34.140 "rw_ios_per_sec": 0, 00:24:34.140 "rw_mbytes_per_sec": 0, 00:24:34.140 "r_mbytes_per_sec": 0, 00:24:34.140 "w_mbytes_per_sec": 0 00:24:34.140 }, 00:24:34.140 "claimed": false, 00:24:34.140 "zoned": false, 00:24:34.140 "supported_io_types": { 00:24:34.140 "read": true, 00:24:34.140 "write": true, 00:24:34.140 "unmap": false, 00:24:34.140 "flush": false, 00:24:34.140 "reset": false, 00:24:34.140 "nvme_admin": false, 00:24:34.140 "nvme_io": false, 00:24:34.140 "nvme_io_md": false, 00:24:34.140 "write_zeroes": true, 00:24:34.140 "zcopy": false, 00:24:34.140 "get_zone_info": false, 00:24:34.140 "zone_management": false, 00:24:34.140 "zone_append": false, 00:24:34.140 "compare": false, 00:24:34.140 "compare_and_write": false, 00:24:34.140 "abort": false, 00:24:34.140 "seek_hole": false, 00:24:34.140 "seek_data": false, 00:24:34.140 "copy": false, 00:24:34.140 "nvme_iov_md": 
false 00:24:34.140 }, 00:24:34.140 "driver_specific": { 00:24:34.140 "compress": { 00:24:34.140 "name": "COMP_lvs0/lv0", 00:24:34.140 "base_bdev_name": "79cda785-d97e-4fa7-9551-db49ae82f289" 00:24:34.140 } 00:24:34.140 } 00:24:34.140 } 00:24:34.140 ] 00:24:34.140 13:47:21 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:34.140 13:47:21 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:34.140 [2024-07-15 13:47:21.705591] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdef41b15c0 PMD being used: compress_qat 00:24:34.140 [2024-07-15 13:47:21.707321] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ab9540 PMD being used: compress_qat 00:24:34.140 Running I/O for 3 seconds... 00:24:37.412 00:24:37.412 Latency(us) 00:24:37.412 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:37.412 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:37.412 Verification LBA range: start 0x0 length 0x3100 00:24:37.412 COMP_lvs0/lv0 : 3.00 5343.76 20.87 0.00 0.00 5954.14 406.04 5841.25 00:24:37.412 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:37.412 Verification LBA range: start 0x3100 length 0x3100 00:24:37.412 COMP_lvs0/lv0 : 3.00 5632.70 22.00 0.00 0.00 5652.00 320.56 5556.31 00:24:37.412 =================================================================================================================== 00:24:37.412 Total : 10976.46 42.88 0.00 0.00 5799.11 320.56 5841.25 00:24:37.412 0 00:24:37.413 13:47:24 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:24:37.413 13:47:24 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:37.413 13:47:24 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:37.670 13:47:25 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:37.670 13:47:25 compress_compdev -- compress/compress.sh@78 -- # killprocess 110801 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 110801 ']' 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 110801 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 110801 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 110801' 00:24:37.670 killing process with pid 110801 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@967 -- # kill 110801 00:24:37.670 Received shutdown signal, test time was about 3.000000 seconds 00:24:37.670 00:24:37.670 Latency(us) 00:24:37.670 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:37.670 =================================================================================================================== 00:24:37.670 Total : 0.00 0.00 0.00 0.00 0.00 0.00 
0.00 00:24:37.670 13:47:25 compress_compdev -- common/autotest_common.sh@972 -- # wait 110801 00:24:39.630 13:47:26 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:24:39.630 13:47:26 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:39.630 13:47:26 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=111884 00:24:39.630 13:47:26 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:39.631 13:47:26 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 111884 00:24:39.631 13:47:26 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:24:39.631 13:47:26 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 111884 ']' 00:24:39.631 13:47:26 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:39.631 13:47:26 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:39.631 13:47:26 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:39.631 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:39.631 13:47:26 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:39.631 13:47:26 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:39.631 [2024-07-15 13:47:26.793513] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:39.631 [2024-07-15 13:47:26.793567] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid111884 ] 00:24:39.631 [2024-07-15 13:47:26.875654] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:39.631 [2024-07-15 13:47:26.956339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:39.631 [2024-07-15 13:47:26.956342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:39.888 [2024-07-15 13:47:27.505222] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:40.145 13:47:27 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:40.145 13:47:27 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:40.145 13:47:27 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:24:40.145 13:47:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:40.146 13:47:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:40.711 [2024-07-15 13:47:28.090034] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28b7ef0 PMD being used: compress_qat 00:24:40.711 13:47:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:40.711 13:47:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:40.711 13:47:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:40.711 13:47:28 compress_compdev -- 
common/autotest_common.sh@899 -- # local i 00:24:40.711 13:47:28 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:40.711 13:47:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:40.711 13:47:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:40.711 13:47:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:40.969 [ 00:24:40.969 { 00:24:40.969 "name": "Nvme0n1", 00:24:40.969 "aliases": [ 00:24:40.969 "01000000-0000-0000-5cd2-e42bec7b5351" 00:24:40.969 ], 00:24:40.969 "product_name": "NVMe disk", 00:24:40.969 "block_size": 512, 00:24:40.969 "num_blocks": 7501476528, 00:24:40.969 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:24:40.969 "assigned_rate_limits": { 00:24:40.969 "rw_ios_per_sec": 0, 00:24:40.969 "rw_mbytes_per_sec": 0, 00:24:40.969 "r_mbytes_per_sec": 0, 00:24:40.969 "w_mbytes_per_sec": 0 00:24:40.969 }, 00:24:40.969 "claimed": false, 00:24:40.969 "zoned": false, 00:24:40.969 "supported_io_types": { 00:24:40.969 "read": true, 00:24:40.969 "write": true, 00:24:40.969 "unmap": true, 00:24:40.969 "flush": true, 00:24:40.969 "reset": true, 00:24:40.969 "nvme_admin": true, 00:24:40.969 "nvme_io": true, 00:24:40.969 "nvme_io_md": false, 00:24:40.969 "write_zeroes": true, 00:24:40.969 "zcopy": false, 00:24:40.969 "get_zone_info": false, 00:24:40.969 "zone_management": false, 00:24:40.969 "zone_append": false, 00:24:40.969 "compare": false, 00:24:40.969 "compare_and_write": false, 00:24:40.969 "abort": true, 00:24:40.969 "seek_hole": false, 00:24:40.969 "seek_data": false, 00:24:40.969 "copy": false, 00:24:40.969 "nvme_iov_md": false 00:24:40.969 }, 00:24:40.969 "driver_specific": { 00:24:40.969 "nvme": [ 00:24:40.969 { 00:24:40.969 "pci_address": "0000:5e:00.0", 00:24:40.969 "trid": { 00:24:40.969 "trtype": "PCIe", 00:24:40.969 "traddr": "0000:5e:00.0" 00:24:40.969 }, 00:24:40.969 "ctrlr_data": { 00:24:40.969 "cntlid": 0, 00:24:40.969 "vendor_id": "0x8086", 00:24:40.969 "model_number": "INTEL SSDPF2KX038T1", 00:24:40.969 "serial_number": "PHAX137100D13P8CGN", 00:24:40.969 "firmware_revision": "9CV10015", 00:24:40.969 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:24:40.969 "oacs": { 00:24:40.969 "security": 0, 00:24:40.969 "format": 1, 00:24:40.969 "firmware": 1, 00:24:40.969 "ns_manage": 1 00:24:40.969 }, 00:24:40.970 "multi_ctrlr": false, 00:24:40.970 "ana_reporting": false 00:24:40.970 }, 00:24:40.970 "vs": { 00:24:40.970 "nvme_version": "1.4" 00:24:40.970 }, 00:24:40.970 "ns_data": { 00:24:40.970 "id": 1, 00:24:40.970 "can_share": false 00:24:40.970 } 00:24:40.970 } 00:24:40.970 ], 00:24:40.970 "mp_policy": "active_passive" 00:24:40.970 } 00:24:40.970 } 00:24:40.970 ] 00:24:40.970 13:47:28 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:40.970 13:47:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:41.227 [2024-07-15 13:47:28.638018] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2706380 PMD being used: compress_qat 00:24:41.227 74f2640e-e112-4da9-833d-f5ff8ede23c7 00:24:41.227 13:47:28 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:41.227 
8cb8eccb-e6ab-4d5d-95e5-8892dccd507f 00:24:41.227 13:47:28 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:41.227 13:47:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:41.227 13:47:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:41.227 13:47:28 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:41.227 13:47:28 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:41.227 13:47:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:41.227 13:47:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:41.485 13:47:29 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:41.743 [ 00:24:41.743 { 00:24:41.743 "name": "8cb8eccb-e6ab-4d5d-95e5-8892dccd507f", 00:24:41.743 "aliases": [ 00:24:41.743 "lvs0/lv0" 00:24:41.743 ], 00:24:41.743 "product_name": "Logical Volume", 00:24:41.743 "block_size": 512, 00:24:41.743 "num_blocks": 204800, 00:24:41.743 "uuid": "8cb8eccb-e6ab-4d5d-95e5-8892dccd507f", 00:24:41.743 "assigned_rate_limits": { 00:24:41.743 "rw_ios_per_sec": 0, 00:24:41.743 "rw_mbytes_per_sec": 0, 00:24:41.743 "r_mbytes_per_sec": 0, 00:24:41.743 "w_mbytes_per_sec": 0 00:24:41.743 }, 00:24:41.743 "claimed": false, 00:24:41.743 "zoned": false, 00:24:41.743 "supported_io_types": { 00:24:41.743 "read": true, 00:24:41.743 "write": true, 00:24:41.743 "unmap": true, 00:24:41.743 "flush": false, 00:24:41.743 "reset": true, 00:24:41.743 "nvme_admin": false, 00:24:41.743 "nvme_io": false, 00:24:41.743 "nvme_io_md": false, 00:24:41.743 "write_zeroes": true, 00:24:41.743 "zcopy": false, 00:24:41.743 "get_zone_info": false, 00:24:41.743 "zone_management": false, 00:24:41.743 "zone_append": false, 00:24:41.743 "compare": false, 00:24:41.743 "compare_and_write": false, 00:24:41.743 "abort": false, 00:24:41.743 "seek_hole": true, 00:24:41.743 "seek_data": true, 00:24:41.743 "copy": false, 00:24:41.743 "nvme_iov_md": false 00:24:41.743 }, 00:24:41.743 "driver_specific": { 00:24:41.743 "lvol": { 00:24:41.743 "lvol_store_uuid": "74f2640e-e112-4da9-833d-f5ff8ede23c7", 00:24:41.743 "base_bdev": "Nvme0n1", 00:24:41.743 "thin_provision": true, 00:24:41.743 "num_allocated_clusters": 0, 00:24:41.743 "snapshot": false, 00:24:41.743 "clone": false, 00:24:41.743 "esnap_clone": false 00:24:41.743 } 00:24:41.743 } 00:24:41.743 } 00:24:41.743 ] 00:24:41.743 13:47:29 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:41.743 13:47:29 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:24:41.743 13:47:29 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:24:41.743 [2024-07-15 13:47:29.357174] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:41.743 COMP_lvs0/lv0 00:24:42.001 13:47:29 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@899 -- # local i 
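For reference, the create_vols sequence traced above (together with the waitforbdev check that follows it) reduces to the RPC calls below. This is a condensed sketch copied from the command lines in this log, not the full compress.sh helper logic; the rpc.py path is shortened for readability, and the piping between gen_nvme.sh and load_subsystem_config is assumed rather than visible here.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # build the local NVMe bdev config and load it into the running app
    scripts/gen_nvme.sh | $rpc load_subsystem_config    # piping assumed; exact gen_nvme.sh flags not shown above
    # lvstore plus a thin-provisioned 100 MiB lvol on top of Nvme0n1
    $rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $rpc bdev_lvol_create -t -l lvs0 lv0 100
    # compress vbdev over the lvol, reduce metadata under /tmp/pmem, 4096-byte logical blocks
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096
    # waitforbdev(): settle examine callbacks, then poll for the new bdev with a 2000 ms timeout
    $rpc bdev_wait_for_examine
    $rpc bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000

The suite repeats this bracket with -l 512, with -l 4096, and with no -l at all in the bdevio pass, which is why the same create/wait/get_bdevs trace appears several times in this log.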
00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:42.001 13:47:29 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:42.259 [ 00:24:42.259 { 00:24:42.259 "name": "COMP_lvs0/lv0", 00:24:42.259 "aliases": [ 00:24:42.259 "558e82fd-6012-5b50-b8c4-7985bb4de09f" 00:24:42.259 ], 00:24:42.259 "product_name": "compress", 00:24:42.259 "block_size": 4096, 00:24:42.259 "num_blocks": 25088, 00:24:42.259 "uuid": "558e82fd-6012-5b50-b8c4-7985bb4de09f", 00:24:42.259 "assigned_rate_limits": { 00:24:42.259 "rw_ios_per_sec": 0, 00:24:42.259 "rw_mbytes_per_sec": 0, 00:24:42.259 "r_mbytes_per_sec": 0, 00:24:42.259 "w_mbytes_per_sec": 0 00:24:42.259 }, 00:24:42.259 "claimed": false, 00:24:42.259 "zoned": false, 00:24:42.259 "supported_io_types": { 00:24:42.259 "read": true, 00:24:42.259 "write": true, 00:24:42.259 "unmap": false, 00:24:42.259 "flush": false, 00:24:42.259 "reset": false, 00:24:42.259 "nvme_admin": false, 00:24:42.259 "nvme_io": false, 00:24:42.259 "nvme_io_md": false, 00:24:42.259 "write_zeroes": true, 00:24:42.259 "zcopy": false, 00:24:42.259 "get_zone_info": false, 00:24:42.259 "zone_management": false, 00:24:42.259 "zone_append": false, 00:24:42.259 "compare": false, 00:24:42.259 "compare_and_write": false, 00:24:42.259 "abort": false, 00:24:42.259 "seek_hole": false, 00:24:42.259 "seek_data": false, 00:24:42.259 "copy": false, 00:24:42.259 "nvme_iov_md": false 00:24:42.259 }, 00:24:42.259 "driver_specific": { 00:24:42.259 "compress": { 00:24:42.259 "name": "COMP_lvs0/lv0", 00:24:42.259 "base_bdev_name": "8cb8eccb-e6ab-4d5d-95e5-8892dccd507f" 00:24:42.259 } 00:24:42.259 } 00:24:42.259 } 00:24:42.259 ] 00:24:42.259 13:47:29 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:42.259 13:47:29 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:42.259 [2024-07-15 13:47:29.815271] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f75541b15c0 PMD being used: compress_qat 00:24:42.259 [2024-07-15 13:47:29.817000] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26ee540 PMD being used: compress_qat 00:24:42.259 Running I/O for 3 seconds... 
00:24:45.539 00:24:45.539 Latency(us) 00:24:45.539 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:45.539 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:45.539 Verification LBA range: start 0x0 length 0x3100 00:24:45.539 COMP_lvs0/lv0 : 3.00 5311.98 20.75 0.00 0.00 5988.85 441.66 5613.30 00:24:45.539 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:45.539 Verification LBA range: start 0x3100 length 0x3100 00:24:45.539 COMP_lvs0/lv0 : 3.00 5554.76 21.70 0.00 0.00 5730.23 381.11 5214.39 00:24:45.539 =================================================================================================================== 00:24:45.539 Total : 10866.74 42.45 0.00 0.00 5856.66 381.11 5613.30 00:24:45.539 0 00:24:45.539 13:47:32 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:24:45.539 13:47:32 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:45.539 13:47:33 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:45.798 13:47:33 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:45.798 13:47:33 compress_compdev -- compress/compress.sh@78 -- # killprocess 111884 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 111884 ']' 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 111884 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 111884 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 111884' 00:24:45.798 killing process with pid 111884 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@967 -- # kill 111884 00:24:45.798 Received shutdown signal, test time was about 3.000000 seconds 00:24:45.798 00:24:45.798 Latency(us) 00:24:45.798 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:45.798 =================================================================================================================== 00:24:45.798 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:45.798 13:47:33 compress_compdev -- common/autotest_common.sh@972 -- # wait 111884 00:24:47.705 13:47:34 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:24:47.705 13:47:34 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:47.705 13:47:34 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=112964 00:24:47.705 13:47:34 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:47.705 13:47:34 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:24:47.705 13:47:34 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 112964 
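Both bdevperf passes above follow the same drive/teardown shape. Stripped of the waitforlisten/killprocess helpers and traps, it is roughly the sketch below; flags and RPC names are copied from the log, while the plain kill at the end is a simplification of the killprocess helper.

    # start bdevperf in wait mode against the DPDK compressdev config (flags as logged above)
    ./build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
        -c test/compress/dpdk.json &
    bdevperf_pid=$!
    # ... create_vols as sketched earlier, then kick off the 3-second verify run over RPC
    ./examples/bdev/bdevperf/bdevperf.py perform_tests
    # destroy_vols: drop the compress vbdev and the lvstore, then stop the app
    $rpc bdev_compress_delete COMP_lvs0/lv0
    $rpc bdev_lvol_delete_lvstore -l lvs0
    kill $bdevperf_pid    # compress.sh actually uses its killprocess helper; plain kill shown for brevity

The bdevio pass now starting swaps bdevperf for test/bdev/bdevio/bdevio -c test/compress/dpdk.json -w, drives it with tests.py perform_tests instead of bdevperf.py, and keeps the same create_vols/destroy_vols bracketing.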
00:24:47.705 13:47:34 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 112964 ']' 00:24:47.705 13:47:34 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:47.705 13:47:34 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:47.705 13:47:34 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:47.705 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:47.705 13:47:34 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:47.705 13:47:34 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:47.705 [2024-07-15 13:47:34.946785] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:47.705 [2024-07-15 13:47:34.946850] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid112964 ] 00:24:47.705 [2024-07-15 13:47:35.034368] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:47.705 [2024-07-15 13:47:35.120812] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:47.705 [2024-07-15 13:47:35.120899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:47.705 [2024-07-15 13:47:35.120901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:48.271 [2024-07-15 13:47:35.682588] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:48.271 13:47:35 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:48.271 13:47:35 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:48.271 13:47:35 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:24:48.271 13:47:35 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:48.271 13:47:35 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:48.835 [2024-07-15 13:47:36.250357] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c91a00 PMD being used: compress_qat 00:24:48.835 13:47:36 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:48.835 13:47:36 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:48.835 13:47:36 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:48.835 13:47:36 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:48.835 13:47:36 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:48.835 13:47:36 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:48.835 13:47:36 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:49.092 13:47:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:49.092 [ 00:24:49.092 { 00:24:49.092 "name": "Nvme0n1", 00:24:49.092 "aliases": [ 00:24:49.092 "01000000-0000-0000-5cd2-e42bec7b5351" 00:24:49.092 ], 00:24:49.092 "product_name": "NVMe disk", 00:24:49.092 "block_size": 512, 00:24:49.092 
"num_blocks": 7501476528, 00:24:49.092 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:24:49.092 "assigned_rate_limits": { 00:24:49.092 "rw_ios_per_sec": 0, 00:24:49.092 "rw_mbytes_per_sec": 0, 00:24:49.092 "r_mbytes_per_sec": 0, 00:24:49.092 "w_mbytes_per_sec": 0 00:24:49.092 }, 00:24:49.092 "claimed": false, 00:24:49.092 "zoned": false, 00:24:49.092 "supported_io_types": { 00:24:49.092 "read": true, 00:24:49.092 "write": true, 00:24:49.093 "unmap": true, 00:24:49.093 "flush": true, 00:24:49.093 "reset": true, 00:24:49.093 "nvme_admin": true, 00:24:49.093 "nvme_io": true, 00:24:49.093 "nvme_io_md": false, 00:24:49.093 "write_zeroes": true, 00:24:49.093 "zcopy": false, 00:24:49.093 "get_zone_info": false, 00:24:49.093 "zone_management": false, 00:24:49.093 "zone_append": false, 00:24:49.093 "compare": false, 00:24:49.093 "compare_and_write": false, 00:24:49.093 "abort": true, 00:24:49.093 "seek_hole": false, 00:24:49.093 "seek_data": false, 00:24:49.093 "copy": false, 00:24:49.093 "nvme_iov_md": false 00:24:49.093 }, 00:24:49.093 "driver_specific": { 00:24:49.093 "nvme": [ 00:24:49.093 { 00:24:49.093 "pci_address": "0000:5e:00.0", 00:24:49.093 "trid": { 00:24:49.093 "trtype": "PCIe", 00:24:49.093 "traddr": "0000:5e:00.0" 00:24:49.093 }, 00:24:49.093 "ctrlr_data": { 00:24:49.093 "cntlid": 0, 00:24:49.093 "vendor_id": "0x8086", 00:24:49.093 "model_number": "INTEL SSDPF2KX038T1", 00:24:49.093 "serial_number": "PHAX137100D13P8CGN", 00:24:49.093 "firmware_revision": "9CV10015", 00:24:49.093 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:24:49.093 "oacs": { 00:24:49.093 "security": 0, 00:24:49.093 "format": 1, 00:24:49.093 "firmware": 1, 00:24:49.093 "ns_manage": 1 00:24:49.093 }, 00:24:49.093 "multi_ctrlr": false, 00:24:49.093 "ana_reporting": false 00:24:49.093 }, 00:24:49.093 "vs": { 00:24:49.093 "nvme_version": "1.4" 00:24:49.093 }, 00:24:49.093 "ns_data": { 00:24:49.093 "id": 1, 00:24:49.093 "can_share": false 00:24:49.093 } 00:24:49.093 } 00:24:49.093 ], 00:24:49.093 "mp_policy": "active_passive" 00:24:49.093 } 00:24:49.093 } 00:24:49.093 ] 00:24:49.093 13:47:36 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:49.093 13:47:36 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:49.350 [2024-07-15 13:47:36.798715] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1adfe20 PMD being used: compress_qat 00:24:49.350 9ecb3828-5f9b-40b4-a3d8-96dbd5969714 00:24:49.350 13:47:36 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:49.606 7648351c-5827-43a8-be81-f6381a2be9c9 00:24:49.606 13:47:37 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:49.606 13:47:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:49.606 13:47:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:49.606 13:47:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:49.606 13:47:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:49.606 13:47:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:49.606 13:47:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:49.606 13:47:37 compress_compdev -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:49.863 [ 00:24:49.863 { 00:24:49.863 "name": "7648351c-5827-43a8-be81-f6381a2be9c9", 00:24:49.863 "aliases": [ 00:24:49.863 "lvs0/lv0" 00:24:49.863 ], 00:24:49.863 "product_name": "Logical Volume", 00:24:49.863 "block_size": 512, 00:24:49.863 "num_blocks": 204800, 00:24:49.863 "uuid": "7648351c-5827-43a8-be81-f6381a2be9c9", 00:24:49.863 "assigned_rate_limits": { 00:24:49.863 "rw_ios_per_sec": 0, 00:24:49.863 "rw_mbytes_per_sec": 0, 00:24:49.863 "r_mbytes_per_sec": 0, 00:24:49.863 "w_mbytes_per_sec": 0 00:24:49.863 }, 00:24:49.863 "claimed": false, 00:24:49.863 "zoned": false, 00:24:49.863 "supported_io_types": { 00:24:49.863 "read": true, 00:24:49.863 "write": true, 00:24:49.863 "unmap": true, 00:24:49.863 "flush": false, 00:24:49.863 "reset": true, 00:24:49.863 "nvme_admin": false, 00:24:49.863 "nvme_io": false, 00:24:49.863 "nvme_io_md": false, 00:24:49.863 "write_zeroes": true, 00:24:49.863 "zcopy": false, 00:24:49.863 "get_zone_info": false, 00:24:49.863 "zone_management": false, 00:24:49.863 "zone_append": false, 00:24:49.863 "compare": false, 00:24:49.863 "compare_and_write": false, 00:24:49.863 "abort": false, 00:24:49.863 "seek_hole": true, 00:24:49.863 "seek_data": true, 00:24:49.863 "copy": false, 00:24:49.863 "nvme_iov_md": false 00:24:49.863 }, 00:24:49.863 "driver_specific": { 00:24:49.863 "lvol": { 00:24:49.863 "lvol_store_uuid": "9ecb3828-5f9b-40b4-a3d8-96dbd5969714", 00:24:49.863 "base_bdev": "Nvme0n1", 00:24:49.864 "thin_provision": true, 00:24:49.864 "num_allocated_clusters": 0, 00:24:49.864 "snapshot": false, 00:24:49.864 "clone": false, 00:24:49.864 "esnap_clone": false 00:24:49.864 } 00:24:49.864 } 00:24:49.864 } 00:24:49.864 ] 00:24:49.864 13:47:37 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:49.864 13:47:37 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:49.864 13:47:37 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:50.121 [2024-07-15 13:47:37.537975] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:50.121 COMP_lvs0/lv0 00:24:50.121 13:47:37 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:50.121 13:47:37 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:50.379 [ 00:24:50.379 { 00:24:50.379 "name": "COMP_lvs0/lv0", 00:24:50.379 "aliases": [ 00:24:50.379 "9c04eee0-b771-5560-af16-7b9a4dbab790" 00:24:50.379 ], 00:24:50.379 "product_name": "compress", 00:24:50.379 "block_size": 512, 00:24:50.379 "num_blocks": 200704, 00:24:50.379 "uuid": 
"9c04eee0-b771-5560-af16-7b9a4dbab790", 00:24:50.379 "assigned_rate_limits": { 00:24:50.379 "rw_ios_per_sec": 0, 00:24:50.379 "rw_mbytes_per_sec": 0, 00:24:50.379 "r_mbytes_per_sec": 0, 00:24:50.379 "w_mbytes_per_sec": 0 00:24:50.379 }, 00:24:50.379 "claimed": false, 00:24:50.379 "zoned": false, 00:24:50.379 "supported_io_types": { 00:24:50.379 "read": true, 00:24:50.379 "write": true, 00:24:50.379 "unmap": false, 00:24:50.379 "flush": false, 00:24:50.379 "reset": false, 00:24:50.379 "nvme_admin": false, 00:24:50.379 "nvme_io": false, 00:24:50.379 "nvme_io_md": false, 00:24:50.379 "write_zeroes": true, 00:24:50.379 "zcopy": false, 00:24:50.379 "get_zone_info": false, 00:24:50.379 "zone_management": false, 00:24:50.379 "zone_append": false, 00:24:50.379 "compare": false, 00:24:50.379 "compare_and_write": false, 00:24:50.379 "abort": false, 00:24:50.379 "seek_hole": false, 00:24:50.379 "seek_data": false, 00:24:50.379 "copy": false, 00:24:50.379 "nvme_iov_md": false 00:24:50.379 }, 00:24:50.379 "driver_specific": { 00:24:50.379 "compress": { 00:24:50.379 "name": "COMP_lvs0/lv0", 00:24:50.379 "base_bdev_name": "7648351c-5827-43a8-be81-f6381a2be9c9" 00:24:50.379 } 00:24:50.379 } 00:24:50.379 } 00:24:50.379 ] 00:24:50.379 13:47:37 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:50.379 13:47:37 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:50.379 [2024-07-15 13:47:37.974838] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb9e41b1350 PMD being used: compress_qat 00:24:50.379 I/O targets: 00:24:50.379 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:24:50.379 00:24:50.379 00:24:50.379 CUnit - A unit testing framework for C - Version 2.1-3 00:24:50.379 http://cunit.sourceforge.net/ 00:24:50.379 00:24:50.379 00:24:50.379 Suite: bdevio tests on: COMP_lvs0/lv0 00:24:50.379 Test: blockdev write read block ...passed 00:24:50.379 Test: blockdev write zeroes read block ...passed 00:24:50.379 Test: blockdev write zeroes read no split ...passed 00:24:50.379 Test: blockdev write zeroes read split ...passed 00:24:50.636 Test: blockdev write zeroes read split partial ...passed 00:24:50.636 Test: blockdev reset ...[2024-07-15 13:47:38.012455] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:24:50.636 passed 00:24:50.636 Test: blockdev write read 8 blocks ...passed 00:24:50.636 Test: blockdev write read size > 128k ...passed 00:24:50.636 Test: blockdev write read invalid size ...passed 00:24:50.636 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:50.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:50.636 Test: blockdev write read max offset ...passed 00:24:50.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:50.636 Test: blockdev writev readv 8 blocks ...passed 00:24:50.636 Test: blockdev writev readv 30 x 1block ...passed 00:24:50.636 Test: blockdev writev readv block ...passed 00:24:50.636 Test: blockdev writev readv size > 128k ...passed 00:24:50.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:50.636 Test: blockdev comparev and writev ...passed 00:24:50.636 Test: blockdev nvme passthru rw ...passed 00:24:50.636 Test: blockdev nvme passthru vendor specific ...passed 00:24:50.636 Test: blockdev nvme admin passthru ...passed 00:24:50.636 Test: blockdev copy ...passed 00:24:50.636 00:24:50.636 Run Summary: Type Total Ran 
Passed Failed Inactive 00:24:50.636 suites 1 1 n/a 0 0 00:24:50.636 tests 23 23 23 0 0 00:24:50.636 asserts 130 130 130 0 n/a 00:24:50.636 00:24:50.636 Elapsed time = 0.090 seconds 00:24:50.636 0 00:24:50.636 13:47:38 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:24:50.636 13:47:38 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:50.636 13:47:38 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:50.894 13:47:38 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:24:50.894 13:47:38 compress_compdev -- compress/compress.sh@62 -- # killprocess 112964 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 112964 ']' 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 112964 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 112964 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 112964' 00:24:50.894 killing process with pid 112964 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@967 -- # kill 112964 00:24:50.894 13:47:38 compress_compdev -- common/autotest_common.sh@972 -- # wait 112964 00:24:52.808 13:47:40 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:24:52.808 13:47:40 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:24:52.808 00:24:52.808 real 0m29.798s 00:24:52.808 user 1m7.901s 00:24:52.808 sys 0m4.675s 00:24:52.808 13:47:40 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:52.808 13:47:40 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:52.808 ************************************ 00:24:52.808 END TEST compress_compdev 00:24:52.808 ************************************ 00:24:52.808 13:47:40 -- common/autotest_common.sh@1142 -- # return 0 00:24:52.808 13:47:40 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:52.808 13:47:40 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:24:52.808 13:47:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:52.808 13:47:40 -- common/autotest_common.sh@10 -- # set +x 00:24:52.808 ************************************ 00:24:52.808 START TEST compress_isal 00:24:52.808 ************************************ 00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:52.808 * Looking for test storage... 
00:24:52.808 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00d40ca9-2a78-e711-906e-0017a4403562 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=00d40ca9-2a78-e711-906e-0017a4403562 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:52.808 13:47:40 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:52.808 13:47:40 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:52.808 13:47:40 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:52.808 13:47:40 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:52.808 13:47:40 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:52.808 13:47:40 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:52.808 13:47:40 compress_isal -- paths/export.sh@5 -- # export PATH 00:24:52.808 13:47:40 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@47 -- # : 0 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:52.808 13:47:40 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=113753 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@73 -- # waitforlisten 113753 00:24:52.808 13:47:40 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 113753 ']' 00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:52.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
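Note the difference from the compdev pass above: with test_type=isal the '[[ isal == compdev ]]' branch is skipped, so bdevperf is launched without -c .../test/compress/dpdk.json and no DPDK compressdev PMD is configured. Compression is then expected to go through the ISA-L software path rather than the compress_qat PMD noticed earlier, though the exact accel wiring is not visible in this snippet. The launch reduces to the line below, copied from the command trace above:

    # compress_isal pass: same queue depth, IO size and verify workload, but no compressdev config file
    ./build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &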
00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:52.808 13:47:40 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:52.808 [2024-07-15 13:47:40.358585] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:24:52.808 [2024-07-15 13:47:40.358649] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid113753 ] 00:24:53.067 [2024-07-15 13:47:40.447337] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:53.067 [2024-07-15 13:47:40.528664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:53.067 [2024-07-15 13:47:40.528668] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:53.632 13:47:41 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:53.632 13:47:41 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:53.632 13:47:41 compress_isal -- compress/compress.sh@74 -- # create_vols 00:24:53.632 13:47:41 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:53.632 13:47:41 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:54.199 13:47:41 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:54.199 13:47:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:54.199 13:47:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:54.199 13:47:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:54.199 13:47:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:54.199 13:47:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:54.199 13:47:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:54.456 13:47:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:54.456 [ 00:24:54.456 { 00:24:54.456 "name": "Nvme0n1", 00:24:54.456 "aliases": [ 00:24:54.456 "01000000-0000-0000-5cd2-e42bec7b5351" 00:24:54.456 ], 00:24:54.456 "product_name": "NVMe disk", 00:24:54.456 "block_size": 512, 00:24:54.456 "num_blocks": 7501476528, 00:24:54.456 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:24:54.456 "assigned_rate_limits": { 00:24:54.456 "rw_ios_per_sec": 0, 00:24:54.456 "rw_mbytes_per_sec": 0, 00:24:54.456 "r_mbytes_per_sec": 0, 00:24:54.456 "w_mbytes_per_sec": 0 00:24:54.456 }, 00:24:54.456 "claimed": false, 00:24:54.456 "zoned": false, 00:24:54.456 "supported_io_types": { 00:24:54.456 "read": true, 00:24:54.456 "write": true, 00:24:54.456 "unmap": true, 00:24:54.456 "flush": true, 00:24:54.456 "reset": true, 00:24:54.456 "nvme_admin": true, 00:24:54.456 "nvme_io": true, 00:24:54.456 "nvme_io_md": false, 00:24:54.456 "write_zeroes": true, 00:24:54.456 "zcopy": false, 00:24:54.456 "get_zone_info": false, 00:24:54.456 "zone_management": false, 00:24:54.456 "zone_append": false, 00:24:54.456 "compare": false, 00:24:54.456 "compare_and_write": false, 00:24:54.457 "abort": true, 00:24:54.457 "seek_hole": false, 00:24:54.457 "seek_data": false, 00:24:54.457 "copy": false, 00:24:54.457 
"nvme_iov_md": false 00:24:54.457 }, 00:24:54.457 "driver_specific": { 00:24:54.457 "nvme": [ 00:24:54.457 { 00:24:54.457 "pci_address": "0000:5e:00.0", 00:24:54.457 "trid": { 00:24:54.457 "trtype": "PCIe", 00:24:54.457 "traddr": "0000:5e:00.0" 00:24:54.457 }, 00:24:54.457 "ctrlr_data": { 00:24:54.457 "cntlid": 0, 00:24:54.457 "vendor_id": "0x8086", 00:24:54.457 "model_number": "INTEL SSDPF2KX038T1", 00:24:54.457 "serial_number": "PHAX137100D13P8CGN", 00:24:54.457 "firmware_revision": "9CV10015", 00:24:54.457 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:24:54.457 "oacs": { 00:24:54.457 "security": 0, 00:24:54.457 "format": 1, 00:24:54.457 "firmware": 1, 00:24:54.457 "ns_manage": 1 00:24:54.457 }, 00:24:54.457 "multi_ctrlr": false, 00:24:54.457 "ana_reporting": false 00:24:54.457 }, 00:24:54.457 "vs": { 00:24:54.457 "nvme_version": "1.4" 00:24:54.457 }, 00:24:54.457 "ns_data": { 00:24:54.457 "id": 1, 00:24:54.457 "can_share": false 00:24:54.457 } 00:24:54.457 } 00:24:54.457 ], 00:24:54.457 "mp_policy": "active_passive" 00:24:54.457 } 00:24:54.457 } 00:24:54.457 ] 00:24:54.457 13:47:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:54.457 13:47:42 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:54.714 9ed0cc44-9347-403b-9b8d-20a5a6a0928b 00:24:54.714 13:47:42 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:54.972 e677c783-866b-4ea2-b56b-6cac9540e35b 00:24:54.972 13:47:42 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:54.972 13:47:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:55.230 [ 00:24:55.230 { 00:24:55.230 "name": "e677c783-866b-4ea2-b56b-6cac9540e35b", 00:24:55.230 "aliases": [ 00:24:55.230 "lvs0/lv0" 00:24:55.230 ], 00:24:55.230 "product_name": "Logical Volume", 00:24:55.230 "block_size": 512, 00:24:55.230 "num_blocks": 204800, 00:24:55.230 "uuid": "e677c783-866b-4ea2-b56b-6cac9540e35b", 00:24:55.230 "assigned_rate_limits": { 00:24:55.230 "rw_ios_per_sec": 0, 00:24:55.230 "rw_mbytes_per_sec": 0, 00:24:55.230 "r_mbytes_per_sec": 0, 00:24:55.230 "w_mbytes_per_sec": 0 00:24:55.230 }, 00:24:55.230 "claimed": false, 00:24:55.230 "zoned": false, 00:24:55.230 "supported_io_types": { 00:24:55.230 "read": true, 00:24:55.230 "write": true, 00:24:55.230 "unmap": true, 00:24:55.230 "flush": false, 00:24:55.230 "reset": true, 00:24:55.230 "nvme_admin": false, 00:24:55.230 "nvme_io": false, 00:24:55.230 "nvme_io_md": false, 00:24:55.230 "write_zeroes": true, 00:24:55.230 "zcopy": false, 00:24:55.230 "get_zone_info": false, 00:24:55.230 "zone_management": false, 00:24:55.230 "zone_append": false, 
00:24:55.230 "compare": false, 00:24:55.230 "compare_and_write": false, 00:24:55.230 "abort": false, 00:24:55.230 "seek_hole": true, 00:24:55.230 "seek_data": true, 00:24:55.230 "copy": false, 00:24:55.230 "nvme_iov_md": false 00:24:55.230 }, 00:24:55.230 "driver_specific": { 00:24:55.230 "lvol": { 00:24:55.230 "lvol_store_uuid": "9ed0cc44-9347-403b-9b8d-20a5a6a0928b", 00:24:55.230 "base_bdev": "Nvme0n1", 00:24:55.230 "thin_provision": true, 00:24:55.230 "num_allocated_clusters": 0, 00:24:55.230 "snapshot": false, 00:24:55.230 "clone": false, 00:24:55.230 "esnap_clone": false 00:24:55.230 } 00:24:55.230 } 00:24:55.230 } 00:24:55.230 ] 00:24:55.230 13:47:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:55.230 13:47:42 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:55.230 13:47:42 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:55.489 [2024-07-15 13:47:42.915026] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:55.489 COMP_lvs0/lv0 00:24:55.489 13:47:42 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:55.489 13:47:42 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:55.489 13:47:42 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:55.489 13:47:42 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:55.489 13:47:42 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:55.489 13:47:42 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:55.489 13:47:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:55.747 13:47:43 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:55.747 [ 00:24:55.747 { 00:24:55.747 "name": "COMP_lvs0/lv0", 00:24:55.747 "aliases": [ 00:24:55.747 "44103c78-1a51-5f84-bd2f-d19bb67c8ad2" 00:24:55.747 ], 00:24:55.747 "product_name": "compress", 00:24:55.747 "block_size": 512, 00:24:55.747 "num_blocks": 200704, 00:24:55.747 "uuid": "44103c78-1a51-5f84-bd2f-d19bb67c8ad2", 00:24:55.747 "assigned_rate_limits": { 00:24:55.747 "rw_ios_per_sec": 0, 00:24:55.747 "rw_mbytes_per_sec": 0, 00:24:55.747 "r_mbytes_per_sec": 0, 00:24:55.747 "w_mbytes_per_sec": 0 00:24:55.747 }, 00:24:55.747 "claimed": false, 00:24:55.747 "zoned": false, 00:24:55.747 "supported_io_types": { 00:24:55.747 "read": true, 00:24:55.747 "write": true, 00:24:55.747 "unmap": false, 00:24:55.747 "flush": false, 00:24:55.747 "reset": false, 00:24:55.747 "nvme_admin": false, 00:24:55.747 "nvme_io": false, 00:24:55.747 "nvme_io_md": false, 00:24:55.747 "write_zeroes": true, 00:24:55.747 "zcopy": false, 00:24:55.747 "get_zone_info": false, 00:24:55.747 "zone_management": false, 00:24:55.747 "zone_append": false, 00:24:55.747 "compare": false, 00:24:55.747 "compare_and_write": false, 00:24:55.747 "abort": false, 00:24:55.747 "seek_hole": false, 00:24:55.747 "seek_data": false, 00:24:55.747 "copy": false, 00:24:55.747 "nvme_iov_md": false 00:24:55.747 }, 00:24:55.747 "driver_specific": { 00:24:55.747 "compress": { 00:24:55.747 "name": "COMP_lvs0/lv0", 00:24:55.747 "base_bdev_name": "e677c783-866b-4ea2-b56b-6cac9540e35b" 00:24:55.747 } 00:24:55.747 } 00:24:55.747 } 
00:24:55.747 ] 00:24:55.747 13:47:43 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:55.747 13:47:43 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:56.004 Running I/O for 3 seconds... 00:24:59.284 00:24:59.284 Latency(us) 00:24:59.284 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:59.284 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:59.284 Verification LBA range: start 0x0 length 0x3100 00:24:59.284 COMP_lvs0/lv0 : 3.00 4042.55 15.79 0.00 0.00 7878.09 701.66 8662.15 00:24:59.284 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:59.284 Verification LBA range: start 0x3100 length 0x3100 00:24:59.284 COMP_lvs0/lv0 : 3.00 4047.75 15.81 0.00 0.00 7871.36 498.64 8662.15 00:24:59.284 =================================================================================================================== 00:24:59.284 Total : 8090.30 31.60 0.00 0.00 7874.72 498.64 8662.15 00:24:59.284 0 00:24:59.284 13:47:46 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:59.284 13:47:46 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:59.284 13:47:46 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:59.284 13:47:46 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:59.284 13:47:46 compress_isal -- compress/compress.sh@78 -- # killprocess 113753 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 113753 ']' 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@952 -- # kill -0 113753 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 113753 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 113753' 00:24:59.284 killing process with pid 113753 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@967 -- # kill 113753 00:24:59.284 Received shutdown signal, test time was about 3.000000 seconds 00:24:59.284 00:24:59.284 Latency(us) 00:24:59.284 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:59.284 =================================================================================================================== 00:24:59.284 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:59.284 13:47:46 compress_isal -- common/autotest_common.sh@972 -- # wait 113753 00:25:01.205 13:47:48 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:25:01.205 13:47:48 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:25:01.205 13:47:48 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=114832 00:25:01.205 13:47:48 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:01.205 13:47:48 compress_isal -- 
compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:25:01.205 13:47:48 compress_isal -- compress/compress.sh@73 -- # waitforlisten 114832 00:25:01.205 13:47:48 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 114832 ']' 00:25:01.205 13:47:48 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:01.205 13:47:48 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:01.205 13:47:48 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:01.206 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:01.206 13:47:48 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:01.206 13:47:48 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:01.206 [2024-07-15 13:47:48.495036] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:25:01.206 [2024-07-15 13:47:48.495098] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid114832 ] 00:25:01.206 [2024-07-15 13:47:48.582321] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:01.206 [2024-07-15 13:47:48.675533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:25:01.206 [2024-07-15 13:47:48.675536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:01.769 13:47:49 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:01.769 13:47:49 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:25:01.769 13:47:49 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:25:01.769 13:47:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:01.769 13:47:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:02.343 13:47:49 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:02.343 13:47:49 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:25:02.343 13:47:49 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:02.343 13:47:49 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:02.343 13:47:49 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:02.343 13:47:49 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:02.343 13:47:49 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:02.600 13:47:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:02.600 [ 00:25:02.600 { 00:25:02.600 "name": "Nvme0n1", 00:25:02.600 "aliases": [ 00:25:02.600 "01000000-0000-0000-5cd2-e42bec7b5351" 00:25:02.600 ], 00:25:02.600 "product_name": "NVMe disk", 00:25:02.600 "block_size": 512, 00:25:02.600 "num_blocks": 7501476528, 00:25:02.600 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:25:02.600 "assigned_rate_limits": { 00:25:02.600 "rw_ios_per_sec": 0, 00:25:02.600 "rw_mbytes_per_sec": 0, 
00:25:02.600 "r_mbytes_per_sec": 0, 00:25:02.600 "w_mbytes_per_sec": 0 00:25:02.600 }, 00:25:02.600 "claimed": false, 00:25:02.600 "zoned": false, 00:25:02.600 "supported_io_types": { 00:25:02.600 "read": true, 00:25:02.600 "write": true, 00:25:02.600 "unmap": true, 00:25:02.600 "flush": true, 00:25:02.600 "reset": true, 00:25:02.600 "nvme_admin": true, 00:25:02.600 "nvme_io": true, 00:25:02.600 "nvme_io_md": false, 00:25:02.600 "write_zeroes": true, 00:25:02.600 "zcopy": false, 00:25:02.600 "get_zone_info": false, 00:25:02.600 "zone_management": false, 00:25:02.600 "zone_append": false, 00:25:02.600 "compare": false, 00:25:02.600 "compare_and_write": false, 00:25:02.600 "abort": true, 00:25:02.600 "seek_hole": false, 00:25:02.600 "seek_data": false, 00:25:02.600 "copy": false, 00:25:02.600 "nvme_iov_md": false 00:25:02.600 }, 00:25:02.600 "driver_specific": { 00:25:02.600 "nvme": [ 00:25:02.600 { 00:25:02.600 "pci_address": "0000:5e:00.0", 00:25:02.600 "trid": { 00:25:02.600 "trtype": "PCIe", 00:25:02.600 "traddr": "0000:5e:00.0" 00:25:02.600 }, 00:25:02.600 "ctrlr_data": { 00:25:02.600 "cntlid": 0, 00:25:02.600 "vendor_id": "0x8086", 00:25:02.600 "model_number": "INTEL SSDPF2KX038T1", 00:25:02.600 "serial_number": "PHAX137100D13P8CGN", 00:25:02.600 "firmware_revision": "9CV10015", 00:25:02.600 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:25:02.600 "oacs": { 00:25:02.600 "security": 0, 00:25:02.600 "format": 1, 00:25:02.600 "firmware": 1, 00:25:02.600 "ns_manage": 1 00:25:02.600 }, 00:25:02.600 "multi_ctrlr": false, 00:25:02.600 "ana_reporting": false 00:25:02.600 }, 00:25:02.600 "vs": { 00:25:02.600 "nvme_version": "1.4" 00:25:02.600 }, 00:25:02.600 "ns_data": { 00:25:02.600 "id": 1, 00:25:02.600 "can_share": false 00:25:02.600 } 00:25:02.600 } 00:25:02.600 ], 00:25:02.600 "mp_policy": "active_passive" 00:25:02.600 } 00:25:02.600 } 00:25:02.600 ] 00:25:02.600 13:47:50 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:02.600 13:47:50 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:02.858 e0c11531-c968-40b6-af04-5447643b1c0a 00:25:02.858 13:47:50 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:03.116 fff34d86-6e07-4b9e-b106-18a084cbdd7c 00:25:03.116 13:47:50 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:03.116 13:47:50 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:25:03.116 13:47:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:03.116 13:47:50 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:03.116 13:47:50 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:03.116 13:47:50 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:03.116 13:47:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:03.374 13:47:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:03.374 [ 00:25:03.374 { 00:25:03.374 "name": "fff34d86-6e07-4b9e-b106-18a084cbdd7c", 00:25:03.374 "aliases": [ 00:25:03.374 "lvs0/lv0" 00:25:03.374 ], 00:25:03.374 "product_name": "Logical Volume", 00:25:03.374 "block_size": 512, 00:25:03.374 
"num_blocks": 204800, 00:25:03.374 "uuid": "fff34d86-6e07-4b9e-b106-18a084cbdd7c", 00:25:03.374 "assigned_rate_limits": { 00:25:03.374 "rw_ios_per_sec": 0, 00:25:03.374 "rw_mbytes_per_sec": 0, 00:25:03.374 "r_mbytes_per_sec": 0, 00:25:03.374 "w_mbytes_per_sec": 0 00:25:03.374 }, 00:25:03.374 "claimed": false, 00:25:03.374 "zoned": false, 00:25:03.374 "supported_io_types": { 00:25:03.374 "read": true, 00:25:03.374 "write": true, 00:25:03.374 "unmap": true, 00:25:03.374 "flush": false, 00:25:03.374 "reset": true, 00:25:03.374 "nvme_admin": false, 00:25:03.374 "nvme_io": false, 00:25:03.374 "nvme_io_md": false, 00:25:03.374 "write_zeroes": true, 00:25:03.374 "zcopy": false, 00:25:03.374 "get_zone_info": false, 00:25:03.374 "zone_management": false, 00:25:03.374 "zone_append": false, 00:25:03.374 "compare": false, 00:25:03.374 "compare_and_write": false, 00:25:03.374 "abort": false, 00:25:03.374 "seek_hole": true, 00:25:03.374 "seek_data": true, 00:25:03.374 "copy": false, 00:25:03.374 "nvme_iov_md": false 00:25:03.374 }, 00:25:03.374 "driver_specific": { 00:25:03.374 "lvol": { 00:25:03.374 "lvol_store_uuid": "e0c11531-c968-40b6-af04-5447643b1c0a", 00:25:03.374 "base_bdev": "Nvme0n1", 00:25:03.374 "thin_provision": true, 00:25:03.374 "num_allocated_clusters": 0, 00:25:03.374 "snapshot": false, 00:25:03.374 "clone": false, 00:25:03.374 "esnap_clone": false 00:25:03.374 } 00:25:03.374 } 00:25:03.374 } 00:25:03.374 ] 00:25:03.374 13:47:50 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:03.374 13:47:50 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:25:03.374 13:47:50 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:25:03.632 [2024-07-15 13:47:51.071226] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:03.632 COMP_lvs0/lv0 00:25:03.632 13:47:51 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:03.632 13:47:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:25:03.632 13:47:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:03.632 13:47:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:03.632 13:47:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:03.632 13:47:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:03.632 13:47:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:03.890 13:47:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:03.890 [ 00:25:03.890 { 00:25:03.890 "name": "COMP_lvs0/lv0", 00:25:03.890 "aliases": [ 00:25:03.890 "1c5e7c08-56f1-57c2-810b-9355f2878096" 00:25:03.890 ], 00:25:03.890 "product_name": "compress", 00:25:03.890 "block_size": 512, 00:25:03.890 "num_blocks": 200704, 00:25:03.890 "uuid": "1c5e7c08-56f1-57c2-810b-9355f2878096", 00:25:03.890 "assigned_rate_limits": { 00:25:03.890 "rw_ios_per_sec": 0, 00:25:03.890 "rw_mbytes_per_sec": 0, 00:25:03.890 "r_mbytes_per_sec": 0, 00:25:03.890 "w_mbytes_per_sec": 0 00:25:03.890 }, 00:25:03.890 "claimed": false, 00:25:03.890 "zoned": false, 00:25:03.890 "supported_io_types": { 00:25:03.890 "read": true, 00:25:03.890 "write": true, 00:25:03.890 "unmap": 
false, 00:25:03.890 "flush": false, 00:25:03.890 "reset": false, 00:25:03.890 "nvme_admin": false, 00:25:03.890 "nvme_io": false, 00:25:03.890 "nvme_io_md": false, 00:25:03.890 "write_zeroes": true, 00:25:03.890 "zcopy": false, 00:25:03.890 "get_zone_info": false, 00:25:03.890 "zone_management": false, 00:25:03.890 "zone_append": false, 00:25:03.890 "compare": false, 00:25:03.890 "compare_and_write": false, 00:25:03.890 "abort": false, 00:25:03.890 "seek_hole": false, 00:25:03.890 "seek_data": false, 00:25:03.890 "copy": false, 00:25:03.890 "nvme_iov_md": false 00:25:03.890 }, 00:25:03.890 "driver_specific": { 00:25:03.890 "compress": { 00:25:03.890 "name": "COMP_lvs0/lv0", 00:25:03.890 "base_bdev_name": "fff34d86-6e07-4b9e-b106-18a084cbdd7c" 00:25:03.890 } 00:25:03.890 } 00:25:03.890 } 00:25:03.890 ] 00:25:03.890 13:47:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:03.890 13:47:51 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:25:04.148 Running I/O for 3 seconds... 00:25:07.432 00:25:07.432 Latency(us) 00:25:07.432 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:07.432 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:07.432 Verification LBA range: start 0x0 length 0x3100 00:25:07.432 COMP_lvs0/lv0 : 3.00 4054.27 15.84 0.00 0.00 7856.22 439.87 6810.05 00:25:07.432 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:07.433 Verification LBA range: start 0x3100 length 0x3100 00:25:07.433 COMP_lvs0/lv0 : 3.00 4056.93 15.85 0.00 0.00 7853.17 491.52 6667.58 00:25:07.433 =================================================================================================================== 00:25:07.433 Total : 8111.20 31.68 0.00 0.00 7854.69 439.87 6810.05 00:25:07.433 0 00:25:07.433 13:47:54 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:25:07.433 13:47:54 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:07.433 13:47:54 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:07.433 13:47:54 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:07.433 13:47:54 compress_isal -- compress/compress.sh@78 -- # killprocess 114832 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 114832 ']' 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@952 -- # kill -0 114832 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@953 -- # uname 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 114832 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 114832' 00:25:07.433 killing process with pid 114832 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@967 -- # kill 114832 00:25:07.433 Received shutdown signal, test time was about 3.000000 seconds 00:25:07.433 00:25:07.433 Latency(us) 00:25:07.433 Device Information : 
runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:07.433 =================================================================================================================== 00:25:07.433 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:07.433 13:47:54 compress_isal -- common/autotest_common.sh@972 -- # wait 114832 00:25:09.337 13:47:56 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:25:09.337 13:47:56 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:25:09.337 13:47:56 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=115909 00:25:09.337 13:47:56 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:09.337 13:47:56 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:25:09.337 13:47:56 compress_isal -- compress/compress.sh@73 -- # waitforlisten 115909 00:25:09.337 13:47:56 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 115909 ']' 00:25:09.337 13:47:56 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:09.337 13:47:56 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:09.337 13:47:56 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:09.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:09.337 13:47:56 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:09.337 13:47:56 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:09.337 [2024-07-15 13:47:56.609103] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:25:09.337 [2024-07-15 13:47:56.609160] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid115909 ] 00:25:09.337 [2024-07-15 13:47:56.696859] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:09.337 [2024-07-15 13:47:56.783323] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:25:09.337 [2024-07-15 13:47:56.783326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:09.902 13:47:57 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:09.902 13:47:57 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:25:09.902 13:47:57 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:25:09.902 13:47:57 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:09.902 13:47:57 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:10.574 13:47:57 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:10.574 13:47:57 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:25:10.574 13:47:57 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:10.574 13:47:57 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:10.574 13:47:57 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:10.574 13:47:57 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:10.574 13:47:57 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:10.574 13:47:58 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:10.832 [ 00:25:10.832 { 00:25:10.832 "name": "Nvme0n1", 00:25:10.832 "aliases": [ 00:25:10.832 "01000000-0000-0000-5cd2-e42bec7b5351" 00:25:10.832 ], 00:25:10.832 "product_name": "NVMe disk", 00:25:10.832 "block_size": 512, 00:25:10.832 "num_blocks": 7501476528, 00:25:10.832 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:25:10.832 "assigned_rate_limits": { 00:25:10.832 "rw_ios_per_sec": 0, 00:25:10.832 "rw_mbytes_per_sec": 0, 00:25:10.832 "r_mbytes_per_sec": 0, 00:25:10.832 "w_mbytes_per_sec": 0 00:25:10.832 }, 00:25:10.832 "claimed": false, 00:25:10.832 "zoned": false, 00:25:10.832 "supported_io_types": { 00:25:10.832 "read": true, 00:25:10.832 "write": true, 00:25:10.832 "unmap": true, 00:25:10.832 "flush": true, 00:25:10.832 "reset": true, 00:25:10.832 "nvme_admin": true, 00:25:10.832 "nvme_io": true, 00:25:10.832 "nvme_io_md": false, 00:25:10.832 "write_zeroes": true, 00:25:10.832 "zcopy": false, 00:25:10.832 "get_zone_info": false, 00:25:10.832 "zone_management": false, 00:25:10.832 "zone_append": false, 00:25:10.832 "compare": false, 00:25:10.832 "compare_and_write": false, 00:25:10.832 "abort": true, 00:25:10.832 "seek_hole": false, 00:25:10.832 "seek_data": false, 00:25:10.832 "copy": false, 00:25:10.832 "nvme_iov_md": false 00:25:10.832 }, 00:25:10.832 "driver_specific": { 00:25:10.832 "nvme": [ 00:25:10.832 { 00:25:10.832 "pci_address": "0000:5e:00.0", 00:25:10.832 "trid": { 00:25:10.832 "trtype": "PCIe", 00:25:10.832 "traddr": "0000:5e:00.0" 00:25:10.832 }, 00:25:10.832 
"ctrlr_data": { 00:25:10.832 "cntlid": 0, 00:25:10.832 "vendor_id": "0x8086", 00:25:10.832 "model_number": "INTEL SSDPF2KX038T1", 00:25:10.832 "serial_number": "PHAX137100D13P8CGN", 00:25:10.832 "firmware_revision": "9CV10015", 00:25:10.832 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:25:10.832 "oacs": { 00:25:10.832 "security": 0, 00:25:10.832 "format": 1, 00:25:10.832 "firmware": 1, 00:25:10.832 "ns_manage": 1 00:25:10.832 }, 00:25:10.832 "multi_ctrlr": false, 00:25:10.832 "ana_reporting": false 00:25:10.832 }, 00:25:10.832 "vs": { 00:25:10.832 "nvme_version": "1.4" 00:25:10.832 }, 00:25:10.832 "ns_data": { 00:25:10.832 "id": 1, 00:25:10.832 "can_share": false 00:25:10.832 } 00:25:10.832 } 00:25:10.832 ], 00:25:10.832 "mp_policy": "active_passive" 00:25:10.832 } 00:25:10.832 } 00:25:10.832 ] 00:25:10.832 13:47:58 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:10.832 13:47:58 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:10.832 24ed4b8e-56ee-492c-81fd-e1e025933670 00:25:11.091 13:47:58 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:11.091 4911d4d9-981e-442a-95d2-11dbece7696c 00:25:11.091 13:47:58 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:11.091 13:47:58 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:25:11.091 13:47:58 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:11.091 13:47:58 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:11.091 13:47:58 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:11.091 13:47:58 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:11.091 13:47:58 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:11.349 13:47:58 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:11.608 [ 00:25:11.608 { 00:25:11.608 "name": "4911d4d9-981e-442a-95d2-11dbece7696c", 00:25:11.608 "aliases": [ 00:25:11.608 "lvs0/lv0" 00:25:11.608 ], 00:25:11.608 "product_name": "Logical Volume", 00:25:11.608 "block_size": 512, 00:25:11.608 "num_blocks": 204800, 00:25:11.608 "uuid": "4911d4d9-981e-442a-95d2-11dbece7696c", 00:25:11.608 "assigned_rate_limits": { 00:25:11.608 "rw_ios_per_sec": 0, 00:25:11.608 "rw_mbytes_per_sec": 0, 00:25:11.608 "r_mbytes_per_sec": 0, 00:25:11.608 "w_mbytes_per_sec": 0 00:25:11.608 }, 00:25:11.608 "claimed": false, 00:25:11.608 "zoned": false, 00:25:11.608 "supported_io_types": { 00:25:11.608 "read": true, 00:25:11.608 "write": true, 00:25:11.608 "unmap": true, 00:25:11.608 "flush": false, 00:25:11.608 "reset": true, 00:25:11.608 "nvme_admin": false, 00:25:11.608 "nvme_io": false, 00:25:11.608 "nvme_io_md": false, 00:25:11.608 "write_zeroes": true, 00:25:11.608 "zcopy": false, 00:25:11.608 "get_zone_info": false, 00:25:11.608 "zone_management": false, 00:25:11.608 "zone_append": false, 00:25:11.608 "compare": false, 00:25:11.608 "compare_and_write": false, 00:25:11.608 "abort": false, 00:25:11.608 "seek_hole": true, 00:25:11.608 "seek_data": true, 00:25:11.608 "copy": false, 00:25:11.608 "nvme_iov_md": false 00:25:11.608 }, 00:25:11.608 "driver_specific": { 
00:25:11.608 "lvol": { 00:25:11.608 "lvol_store_uuid": "24ed4b8e-56ee-492c-81fd-e1e025933670", 00:25:11.608 "base_bdev": "Nvme0n1", 00:25:11.608 "thin_provision": true, 00:25:11.608 "num_allocated_clusters": 0, 00:25:11.608 "snapshot": false, 00:25:11.608 "clone": false, 00:25:11.608 "esnap_clone": false 00:25:11.608 } 00:25:11.608 } 00:25:11.608 } 00:25:11.608 ] 00:25:11.608 13:47:58 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:11.608 13:47:58 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:25:11.608 13:47:58 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:25:11.608 [2024-07-15 13:47:59.157378] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:11.608 COMP_lvs0/lv0 00:25:11.608 13:47:59 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:11.608 13:47:59 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:25:11.608 13:47:59 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:11.608 13:47:59 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:11.608 13:47:59 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:11.608 13:47:59 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:11.608 13:47:59 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:11.867 13:47:59 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:12.126 [ 00:25:12.126 { 00:25:12.126 "name": "COMP_lvs0/lv0", 00:25:12.126 "aliases": [ 00:25:12.126 "0ff188a2-cdd0-5f92-b609-7b872aad5817" 00:25:12.126 ], 00:25:12.126 "product_name": "compress", 00:25:12.126 "block_size": 4096, 00:25:12.126 "num_blocks": 25088, 00:25:12.126 "uuid": "0ff188a2-cdd0-5f92-b609-7b872aad5817", 00:25:12.126 "assigned_rate_limits": { 00:25:12.126 "rw_ios_per_sec": 0, 00:25:12.126 "rw_mbytes_per_sec": 0, 00:25:12.126 "r_mbytes_per_sec": 0, 00:25:12.126 "w_mbytes_per_sec": 0 00:25:12.126 }, 00:25:12.126 "claimed": false, 00:25:12.126 "zoned": false, 00:25:12.126 "supported_io_types": { 00:25:12.126 "read": true, 00:25:12.126 "write": true, 00:25:12.126 "unmap": false, 00:25:12.126 "flush": false, 00:25:12.126 "reset": false, 00:25:12.126 "nvme_admin": false, 00:25:12.126 "nvme_io": false, 00:25:12.126 "nvme_io_md": false, 00:25:12.126 "write_zeroes": true, 00:25:12.126 "zcopy": false, 00:25:12.126 "get_zone_info": false, 00:25:12.126 "zone_management": false, 00:25:12.126 "zone_append": false, 00:25:12.126 "compare": false, 00:25:12.126 "compare_and_write": false, 00:25:12.126 "abort": false, 00:25:12.126 "seek_hole": false, 00:25:12.126 "seek_data": false, 00:25:12.126 "copy": false, 00:25:12.126 "nvme_iov_md": false 00:25:12.126 }, 00:25:12.126 "driver_specific": { 00:25:12.126 "compress": { 00:25:12.126 "name": "COMP_lvs0/lv0", 00:25:12.126 "base_bdev_name": "4911d4d9-981e-442a-95d2-11dbece7696c" 00:25:12.126 } 00:25:12.126 } 00:25:12.126 } 00:25:12.126 ] 00:25:12.126 13:47:59 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:12.126 13:47:59 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 
00:25:12.126 Running I/O for 3 seconds... 00:25:15.412 00:25:15.412 Latency(us) 00:25:15.412 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:15.412 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:15.412 Verification LBA range: start 0x0 length 0x3100 00:25:15.412 COMP_lvs0/lv0 : 3.00 4042.04 15.79 0.00 0.00 7880.44 459.46 7351.43 00:25:15.412 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:15.412 Verification LBA range: start 0x3100 length 0x3100 00:25:15.412 COMP_lvs0/lv0 : 3.00 4038.05 15.77 0.00 0.00 7888.08 591.25 7351.43 00:25:15.412 =================================================================================================================== 00:25:15.412 Total : 8080.09 31.56 0.00 0.00 7884.26 459.46 7351.43 00:25:15.412 0 00:25:15.412 13:48:02 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:25:15.412 13:48:02 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:15.412 13:48:02 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:15.412 13:48:03 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:15.412 13:48:03 compress_isal -- compress/compress.sh@78 -- # killprocess 115909 00:25:15.412 13:48:03 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 115909 ']' 00:25:15.412 13:48:03 compress_isal -- common/autotest_common.sh@952 -- # kill -0 115909 00:25:15.412 13:48:03 compress_isal -- common/autotest_common.sh@953 -- # uname 00:25:15.412 13:48:03 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:15.413 13:48:03 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 115909 00:25:15.670 13:48:03 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:25:15.670 13:48:03 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:25:15.670 13:48:03 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 115909' 00:25:15.670 killing process with pid 115909 00:25:15.670 13:48:03 compress_isal -- common/autotest_common.sh@967 -- # kill 115909 00:25:15.670 Received shutdown signal, test time was about 3.000000 seconds 00:25:15.670 00:25:15.670 Latency(us) 00:25:15.670 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:15.670 =================================================================================================================== 00:25:15.670 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:15.670 13:48:03 compress_isal -- common/autotest_common.sh@972 -- # wait 115909 00:25:17.569 13:48:04 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:25:17.569 13:48:04 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:25:17.569 13:48:04 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=117120 00:25:17.569 13:48:04 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:17.569 13:48:04 compress_isal -- compress/compress.sh@57 -- # waitforlisten 117120 00:25:17.569 13:48:04 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 117120 ']' 00:25:17.569 13:48:04 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:17.569 13:48:04 compress_isal -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:25:17.569 13:48:04 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:17.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:17.569 13:48:04 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:25:17.569 13:48:04 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:17.569 13:48:04 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:17.569 [2024-07-15 13:48:04.727756] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:25:17.569 [2024-07-15 13:48:04.727807] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid117120 ] 00:25:17.569 [2024-07-15 13:48:04.816866] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:17.569 [2024-07-15 13:48:04.910794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:17.569 [2024-07-15 13:48:04.910884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:25:17.569 [2024-07-15 13:48:04.910886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:18.134 13:48:05 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:18.134 13:48:05 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:25:18.134 13:48:05 compress_isal -- compress/compress.sh@58 -- # create_vols 00:25:18.134 13:48:05 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:18.134 13:48:05 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:18.702 13:48:06 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:18.702 13:48:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:18.960 [ 00:25:18.960 { 00:25:18.960 "name": "Nvme0n1", 00:25:18.960 "aliases": [ 00:25:18.960 "01000000-0000-0000-5cd2-e42bec7b5351" 00:25:18.960 ], 00:25:18.960 "product_name": "NVMe disk", 00:25:18.960 "block_size": 512, 00:25:18.960 "num_blocks": 7501476528, 00:25:18.960 "uuid": "01000000-0000-0000-5cd2-e42bec7b5351", 00:25:18.960 "assigned_rate_limits": { 00:25:18.960 "rw_ios_per_sec": 0, 00:25:18.960 "rw_mbytes_per_sec": 0, 00:25:18.960 "r_mbytes_per_sec": 0, 00:25:18.960 "w_mbytes_per_sec": 0 00:25:18.960 }, 00:25:18.960 "claimed": false, 00:25:18.960 "zoned": false, 00:25:18.960 "supported_io_types": { 00:25:18.960 "read": true, 00:25:18.960 
"write": true, 00:25:18.960 "unmap": true, 00:25:18.960 "flush": true, 00:25:18.960 "reset": true, 00:25:18.960 "nvme_admin": true, 00:25:18.960 "nvme_io": true, 00:25:18.960 "nvme_io_md": false, 00:25:18.960 "write_zeroes": true, 00:25:18.960 "zcopy": false, 00:25:18.960 "get_zone_info": false, 00:25:18.960 "zone_management": false, 00:25:18.960 "zone_append": false, 00:25:18.960 "compare": false, 00:25:18.960 "compare_and_write": false, 00:25:18.960 "abort": true, 00:25:18.960 "seek_hole": false, 00:25:18.960 "seek_data": false, 00:25:18.960 "copy": false, 00:25:18.960 "nvme_iov_md": false 00:25:18.960 }, 00:25:18.960 "driver_specific": { 00:25:18.960 "nvme": [ 00:25:18.960 { 00:25:18.960 "pci_address": "0000:5e:00.0", 00:25:18.960 "trid": { 00:25:18.960 "trtype": "PCIe", 00:25:18.960 "traddr": "0000:5e:00.0" 00:25:18.960 }, 00:25:18.960 "ctrlr_data": { 00:25:18.960 "cntlid": 0, 00:25:18.960 "vendor_id": "0x8086", 00:25:18.960 "model_number": "INTEL SSDPF2KX038T1", 00:25:18.960 "serial_number": "PHAX137100D13P8CGN", 00:25:18.960 "firmware_revision": "9CV10015", 00:25:18.960 "subnqn": "nqn.2021-09.com.intel:PHAX137100D13P8CGN ", 00:25:18.960 "oacs": { 00:25:18.960 "security": 0, 00:25:18.960 "format": 1, 00:25:18.960 "firmware": 1, 00:25:18.960 "ns_manage": 1 00:25:18.960 }, 00:25:18.960 "multi_ctrlr": false, 00:25:18.960 "ana_reporting": false 00:25:18.960 }, 00:25:18.960 "vs": { 00:25:18.960 "nvme_version": "1.4" 00:25:18.960 }, 00:25:18.960 "ns_data": { 00:25:18.960 "id": 1, 00:25:18.960 "can_share": false 00:25:18.960 } 00:25:18.960 } 00:25:18.960 ], 00:25:18.960 "mp_policy": "active_passive" 00:25:18.960 } 00:25:18.960 } 00:25:18.960 ] 00:25:18.960 13:48:06 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:18.960 13:48:06 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:19.219 0dc78de9-2cbd-4149-bafd-cfc9f4759a20 00:25:19.219 13:48:06 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:19.219 15a7bc46-6306-43e5-97ee-4ab9a3fb4373 00:25:19.219 13:48:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:19.219 13:48:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:25:19.219 13:48:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:19.219 13:48:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:19.219 13:48:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:19.219 13:48:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:19.219 13:48:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:19.485 13:48:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:19.743 [ 00:25:19.743 { 00:25:19.743 "name": "15a7bc46-6306-43e5-97ee-4ab9a3fb4373", 00:25:19.743 "aliases": [ 00:25:19.743 "lvs0/lv0" 00:25:19.743 ], 00:25:19.743 "product_name": "Logical Volume", 00:25:19.743 "block_size": 512, 00:25:19.743 "num_blocks": 204800, 00:25:19.743 "uuid": "15a7bc46-6306-43e5-97ee-4ab9a3fb4373", 00:25:19.743 "assigned_rate_limits": { 00:25:19.743 "rw_ios_per_sec": 0, 00:25:19.743 "rw_mbytes_per_sec": 0, 00:25:19.743 
"r_mbytes_per_sec": 0, 00:25:19.743 "w_mbytes_per_sec": 0 00:25:19.743 }, 00:25:19.743 "claimed": false, 00:25:19.743 "zoned": false, 00:25:19.743 "supported_io_types": { 00:25:19.743 "read": true, 00:25:19.743 "write": true, 00:25:19.743 "unmap": true, 00:25:19.743 "flush": false, 00:25:19.743 "reset": true, 00:25:19.743 "nvme_admin": false, 00:25:19.743 "nvme_io": false, 00:25:19.743 "nvme_io_md": false, 00:25:19.743 "write_zeroes": true, 00:25:19.743 "zcopy": false, 00:25:19.743 "get_zone_info": false, 00:25:19.743 "zone_management": false, 00:25:19.743 "zone_append": false, 00:25:19.743 "compare": false, 00:25:19.743 "compare_and_write": false, 00:25:19.743 "abort": false, 00:25:19.743 "seek_hole": true, 00:25:19.743 "seek_data": true, 00:25:19.743 "copy": false, 00:25:19.743 "nvme_iov_md": false 00:25:19.743 }, 00:25:19.743 "driver_specific": { 00:25:19.743 "lvol": { 00:25:19.743 "lvol_store_uuid": "0dc78de9-2cbd-4149-bafd-cfc9f4759a20", 00:25:19.743 "base_bdev": "Nvme0n1", 00:25:19.743 "thin_provision": true, 00:25:19.743 "num_allocated_clusters": 0, 00:25:19.743 "snapshot": false, 00:25:19.743 "clone": false, 00:25:19.743 "esnap_clone": false 00:25:19.743 } 00:25:19.743 } 00:25:19.743 } 00:25:19.743 ] 00:25:19.743 13:48:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:19.743 13:48:07 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:25:19.743 13:48:07 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:25:19.743 [2024-07-15 13:48:07.358049] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:19.743 COMP_lvs0/lv0 00:25:20.001 13:48:07 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:20.001 13:48:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:20.259 [ 00:25:20.259 { 00:25:20.259 "name": "COMP_lvs0/lv0", 00:25:20.259 "aliases": [ 00:25:20.259 "2f5fd634-4426-5d23-acb9-add5dfae979d" 00:25:20.259 ], 00:25:20.259 "product_name": "compress", 00:25:20.259 "block_size": 512, 00:25:20.259 "num_blocks": 200704, 00:25:20.259 "uuid": "2f5fd634-4426-5d23-acb9-add5dfae979d", 00:25:20.259 "assigned_rate_limits": { 00:25:20.259 "rw_ios_per_sec": 0, 00:25:20.259 "rw_mbytes_per_sec": 0, 00:25:20.259 "r_mbytes_per_sec": 0, 00:25:20.259 "w_mbytes_per_sec": 0 00:25:20.259 }, 00:25:20.259 "claimed": false, 00:25:20.259 "zoned": false, 00:25:20.259 "supported_io_types": { 00:25:20.259 "read": true, 00:25:20.259 "write": true, 00:25:20.259 "unmap": false, 00:25:20.259 "flush": false, 00:25:20.259 "reset": false, 00:25:20.259 "nvme_admin": false, 00:25:20.259 "nvme_io": false, 00:25:20.259 "nvme_io_md": false, 00:25:20.259 "write_zeroes": true, 00:25:20.259 
"zcopy": false, 00:25:20.259 "get_zone_info": false, 00:25:20.259 "zone_management": false, 00:25:20.259 "zone_append": false, 00:25:20.259 "compare": false, 00:25:20.259 "compare_and_write": false, 00:25:20.259 "abort": false, 00:25:20.259 "seek_hole": false, 00:25:20.259 "seek_data": false, 00:25:20.259 "copy": false, 00:25:20.259 "nvme_iov_md": false 00:25:20.259 }, 00:25:20.259 "driver_specific": { 00:25:20.259 "compress": { 00:25:20.259 "name": "COMP_lvs0/lv0", 00:25:20.259 "base_bdev_name": "15a7bc46-6306-43e5-97ee-4ab9a3fb4373" 00:25:20.259 } 00:25:20.259 } 00:25:20.259 } 00:25:20.259 ] 00:25:20.259 13:48:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:20.259 13:48:07 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:20.259 I/O targets: 00:25:20.259 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:25:20.259 00:25:20.259 00:25:20.259 CUnit - A unit testing framework for C - Version 2.1-3 00:25:20.259 http://cunit.sourceforge.net/ 00:25:20.259 00:25:20.259 00:25:20.259 Suite: bdevio tests on: COMP_lvs0/lv0 00:25:20.259 Test: blockdev write read block ...passed 00:25:20.259 Test: blockdev write zeroes read block ...passed 00:25:20.259 Test: blockdev write zeroes read no split ...passed 00:25:20.259 Test: blockdev write zeroes read split ...passed 00:25:20.517 Test: blockdev write zeroes read split partial ...passed 00:25:20.517 Test: blockdev reset ...[2024-07-15 13:48:07.886302] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:25:20.517 passed 00:25:20.517 Test: blockdev write read 8 blocks ...passed 00:25:20.517 Test: blockdev write read size > 128k ...passed 00:25:20.517 Test: blockdev write read invalid size ...passed 00:25:20.517 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:20.517 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:20.517 Test: blockdev write read max offset ...passed 00:25:20.517 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:20.517 Test: blockdev writev readv 8 blocks ...passed 00:25:20.517 Test: blockdev writev readv 30 x 1block ...passed 00:25:20.517 Test: blockdev writev readv block ...passed 00:25:20.517 Test: blockdev writev readv size > 128k ...passed 00:25:20.517 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:20.517 Test: blockdev comparev and writev ...passed 00:25:20.517 Test: blockdev nvme passthru rw ...passed 00:25:20.517 Test: blockdev nvme passthru vendor specific ...passed 00:25:20.517 Test: blockdev nvme admin passthru ...passed 00:25:20.517 Test: blockdev copy ...passed 00:25:20.517 00:25:20.517 Run Summary: Type Total Ran Passed Failed Inactive 00:25:20.517 suites 1 1 n/a 0 0 00:25:20.517 tests 23 23 23 0 0 00:25:20.517 asserts 130 130 130 0 n/a 00:25:20.517 00:25:20.517 Elapsed time = 0.110 seconds 00:25:20.517 0 00:25:20.517 13:48:07 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:25:20.517 13:48:07 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:20.517 13:48:08 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:20.775 13:48:08 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:25:20.775 13:48:08 compress_isal -- compress/compress.sh@62 -- # 
killprocess 117120 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 117120 ']' 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@952 -- # kill -0 117120 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@953 -- # uname 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 117120 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 117120' 00:25:20.775 killing process with pid 117120 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@967 -- # kill 117120 00:25:20.775 13:48:08 compress_isal -- common/autotest_common.sh@972 -- # wait 117120 00:25:22.679 13:48:09 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:25:22.679 13:48:09 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:25:22.679 00:25:22.679 real 0m29.772s 00:25:22.679 user 1m8.890s 00:25:22.679 sys 0m3.641s 00:25:22.679 13:48:09 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:22.679 13:48:09 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:22.679 ************************************ 00:25:22.679 END TEST compress_isal 00:25:22.679 ************************************ 00:25:22.679 13:48:09 -- common/autotest_common.sh@1142 -- # return 0 00:25:22.679 13:48:09 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:25:22.679 13:48:09 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:25:22.679 13:48:09 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:25:22.679 13:48:09 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:22.679 13:48:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:22.679 13:48:09 -- common/autotest_common.sh@10 -- # set +x 00:25:22.679 ************************************ 00:25:22.679 START TEST blockdev_crypto_aesni 00:25:22.679 ************************************ 00:25:22.679 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:25:22.679 * Looking for test storage... 
00:25:22.679 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:25:22.679 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=118284 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 118284 00:25:22.680 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:25:22.680 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 118284 ']' 00:25:22.680 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:22.680 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:22.680 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk.sock...' 00:25:22.680 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:22.680 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:22.680 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:22.680 [2024-07-15 13:48:10.155602] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:25:22.680 [2024-07-15 13:48:10.155659] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid118284 ] 00:25:22.680 [2024-07-15 13:48:10.239859] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:22.939 [2024-07-15 13:48:10.323028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:23.506 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:23.506 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:25:23.506 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:25:23.506 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:25:23.506 13:48:10 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:25:23.506 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:23.506 13:48:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:23.506 [2024-07-15 13:48:10.972977] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:23.506 [2024-07-15 13:48:10.981012] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:23.506 [2024-07-15 13:48:10.989027] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:23.506 [2024-07-15 13:48:11.056388] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:26.040 true 00:25:26.040 true 00:25:26.040 true 00:25:26.040 true 00:25:26.040 Malloc0 00:25:26.040 Malloc1 00:25:26.040 Malloc2 00:25:26.040 Malloc3 00:25:26.040 [2024-07-15 13:48:13.387830] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:26.040 crypto_ram 00:25:26.040 [2024-07-15 13:48:13.395845] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:26.040 crypto_ram2 00:25:26.040 [2024-07-15 13:48:13.403864] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:26.040 crypto_ram3 00:25:26.040 [2024-07-15 13:48:13.411885] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:26.040 crypto_ram4 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:25:26.040 13:48:13 blockdev_crypto_aesni -- 
bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "27a8969c-f227-530f-ad3b-cbe05b32c610"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "27a8969c-f227-530f-ad3b-cbe05b32c610",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "85829125-c180-5331-8d99-15ffb22db12e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "85829125-c180-5331-8d99-15ffb22db12e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1b139d05-18ff-53b3-b617-7ab042a062d4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1b139d05-18ff-53b3-b617-7ab042a062d4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1d6c75a6-efac-5ff1-a16c-891799d1075e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1d6c75a6-efac-5ff1-a16c-891799d1075e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:26.040 13:48:13 
blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:25:26.040 13:48:13 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 118284 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 118284 ']' 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 118284 00:25:26.040 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 118284 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 118284' 00:25:26.041 killing process with pid 118284 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 118284 00:25:26.041 13:48:13 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 118284 00:25:26.607 13:48:14 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:26.607 13:48:14 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:26.607 13:48:14 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:26.607 13:48:14 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:26.607 13:48:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:26.607 ************************************ 00:25:26.607 START TEST bdev_hello_world 00:25:26.607 ************************************ 00:25:26.607 13:48:14 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:26.865 [2024-07-15 13:48:14.246452] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
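The bdev discovery step above reduces to a short pipeline: snapshot the accel/bdev/iobuf subsystem config, then pull the names of unclaimed bdevs out of bdev_get_bdevs with jq. A minimal recap, assuming rpc.py is run from the SPDK tree against the default socket:

  ./scripts/rpc.py save_subsystem_config -n bdev
  ./scripts/rpc.py bdev_get_bdevs \
    | jq -r '.[] | select(.claimed == false) | .name'   # crypto_ram .. crypto_ram4 in this run

crypto_ram, the first of these names, is then used as the hello-world bdev below.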
00:25:26.865 [2024-07-15 13:48:14.246492] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid118829 ] 00:25:26.865 [2024-07-15 13:48:14.331351] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:26.865 [2024-07-15 13:48:14.415751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:26.865 [2024-07-15 13:48:14.436690] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:26.865 [2024-07-15 13:48:14.444716] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:26.865 [2024-07-15 13:48:14.452736] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:27.124 [2024-07-15 13:48:14.554191] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:29.651 [2024-07-15 13:48:16.751047] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:29.651 [2024-07-15 13:48:16.751105] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:29.651 [2024-07-15 13:48:16.751115] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:29.651 [2024-07-15 13:48:16.759066] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:29.651 [2024-07-15 13:48:16.759081] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:29.651 [2024-07-15 13:48:16.759089] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:29.651 [2024-07-15 13:48:16.767086] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:29.651 [2024-07-15 13:48:16.767100] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:29.651 [2024-07-15 13:48:16.767107] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:29.651 [2024-07-15 13:48:16.775106] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:29.651 [2024-07-15 13:48:16.775121] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:29.651 [2024-07-15 13:48:16.775128] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:29.651 [2024-07-15 13:48:16.844082] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:25:29.651 [2024-07-15 13:48:16.844115] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:25:29.651 [2024-07-15 13:48:16.844128] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:25:29.651 [2024-07-15 13:48:16.845032] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:25:29.651 [2024-07-15 13:48:16.845087] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:25:29.651 [2024-07-15 13:48:16.845098] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:25:29.651 [2024-07-15 13:48:16.845130] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
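Stripped of the harness wrappers, the hello-world pass shown above is a single run of the packaged example against the same JSON config (workspace paths as used in this job):

  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev \
      --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
      -b crypto_ram
  # expected notices: the write completes, then "Read string from bdev : Hello World!"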
00:25:29.651 00:25:29.651 [2024-07-15 13:48:16.845147] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:25:29.651 00:25:29.651 real 0m2.973s 00:25:29.651 user 0m2.612s 00:25:29.651 sys 0m0.329s 00:25:29.651 13:48:17 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:25:29.652 ************************************ 00:25:29.652 END TEST bdev_hello_world 00:25:29.652 ************************************ 00:25:29.652 13:48:17 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:25:29.652 13:48:17 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:25:29.652 13:48:17 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:29.652 13:48:17 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:29.652 13:48:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:29.652 ************************************ 00:25:29.652 START TEST bdev_bounds 00:25:29.652 ************************************ 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=119203 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 119203' 00:25:29.652 Process bdevio pid: 119203 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 119203 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 119203 ']' 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:29.652 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:29.652 13:48:17 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:29.910 [2024-07-15 13:48:17.298371] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:25:29.910 [2024-07-15 13:48:17.298420] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid119203 ] 00:25:29.910 [2024-07-15 13:48:17.384652] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:29.910 [2024-07-15 13:48:17.473671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:29.910 [2024-07-15 13:48:17.473759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:25:29.910 [2024-07-15 13:48:17.473761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.911 [2024-07-15 13:48:17.494755] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:29.911 [2024-07-15 13:48:17.502784] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:29.911 [2024-07-15 13:48:17.510805] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:30.169 [2024-07-15 13:48:17.610228] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:32.702 [2024-07-15 13:48:19.798879] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:32.702 [2024-07-15 13:48:19.798963] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:32.702 [2024-07-15 13:48:19.798978] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:32.702 [2024-07-15 13:48:19.806898] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:32.702 [2024-07-15 13:48:19.806911] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:32.702 [2024-07-15 13:48:19.806919] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:32.702 [2024-07-15 13:48:19.814918] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:32.702 [2024-07-15 13:48:19.814929] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:32.702 [2024-07-15 13:48:19.814937] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:32.702 [2024-07-15 13:48:19.822943] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:32.702 [2024-07-15 13:48:19.822956] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:32.702 [2024-07-15 13:48:19.822964] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:32.702 13:48:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:32.702 13:48:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:25:32.702 13:48:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:32.702 I/O targets: 00:25:32.702 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:25:32.702 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:25:32.702 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:25:32.702 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:25:32.702 00:25:32.702 00:25:32.702 CUnit - A unit testing framework for C - Version 2.1-3 00:25:32.702 http://cunit.sourceforge.net/ 00:25:32.702 00:25:32.702 00:25:32.702 Suite: bdevio tests on: crypto_ram4 00:25:32.702 Test: blockdev write read block ...passed 00:25:32.702 Test: blockdev write zeroes read block ...passed 00:25:32.702 Test: blockdev write zeroes read no split ...passed 00:25:32.702 Test: blockdev write zeroes read split ...passed 00:25:32.703 Test: blockdev write zeroes read split partial ...passed 00:25:32.703 Test: blockdev reset ...passed 00:25:32.703 Test: blockdev write read 8 blocks ...passed 00:25:32.703 Test: blockdev write read size > 128k ...passed 00:25:32.703 Test: blockdev write read invalid size ...passed 00:25:32.703 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:32.703 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:32.703 Test: blockdev write read max offset ...passed 00:25:32.703 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:32.703 Test: blockdev writev readv 8 blocks ...passed 00:25:32.703 Test: blockdev writev readv 30 x 1block ...passed 00:25:32.703 Test: blockdev writev readv block ...passed 00:25:32.703 Test: blockdev writev readv size > 128k ...passed 00:25:32.703 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:32.703 Test: blockdev comparev and writev ...passed 00:25:32.703 Test: blockdev nvme passthru rw ...passed 00:25:32.703 Test: blockdev nvme passthru vendor specific ...passed 00:25:32.703 Test: blockdev nvme admin passthru ...passed 00:25:32.703 Test: blockdev copy ...passed 00:25:32.703 Suite: bdevio tests on: crypto_ram3 00:25:32.703 Test: blockdev write read block ...passed 00:25:32.703 Test: blockdev write zeroes read block ...passed 00:25:32.703 Test: blockdev write zeroes read no split ...passed 00:25:32.703 Test: blockdev write zeroes read split ...passed 00:25:32.703 Test: blockdev write zeroes read split partial ...passed 00:25:32.703 Test: blockdev reset ...passed 00:25:32.703 Test: blockdev write read 8 blocks ...passed 00:25:32.703 Test: blockdev write read size > 128k ...passed 00:25:32.703 Test: blockdev write read invalid size ...passed 00:25:32.703 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:32.703 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:32.703 Test: blockdev write read max offset ...passed 00:25:32.703 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:32.703 Test: blockdev writev readv 8 blocks ...passed 00:25:32.703 Test: blockdev writev readv 30 x 1block ...passed 00:25:32.703 Test: blockdev writev readv block ...passed 00:25:32.703 Test: blockdev writev readv size > 128k ...passed 00:25:32.703 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:32.703 Test: blockdev comparev and writev ...passed 00:25:32.703 Test: blockdev nvme passthru rw ...passed 00:25:32.703 Test: blockdev nvme passthru vendor specific ...passed 00:25:32.703 Test: blockdev nvme admin passthru ...passed 00:25:32.703 Test: blockdev copy ...passed 00:25:32.703 Suite: bdevio tests on: crypto_ram2 00:25:32.703 Test: blockdev write read block ...passed 00:25:32.703 Test: blockdev write zeroes read block ...passed 00:25:32.703 Test: blockdev write zeroes read no split ...passed 00:25:32.703 Test: blockdev write zeroes read split ...passed 00:25:32.703 Test: blockdev write zeroes read split partial ...passed 
00:25:32.703 Test: blockdev reset ...passed 00:25:32.703 Test: blockdev write read 8 blocks ...passed 00:25:32.703 Test: blockdev write read size > 128k ...passed 00:25:32.703 Test: blockdev write read invalid size ...passed 00:25:32.703 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:32.703 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:32.703 Test: blockdev write read max offset ...passed 00:25:32.703 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:32.703 Test: blockdev writev readv 8 blocks ...passed 00:25:32.703 Test: blockdev writev readv 30 x 1block ...passed 00:25:32.703 Test: blockdev writev readv block ...passed 00:25:32.703 Test: blockdev writev readv size > 128k ...passed 00:25:32.703 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:32.703 Test: blockdev comparev and writev ...passed 00:25:32.703 Test: blockdev nvme passthru rw ...passed 00:25:32.703 Test: blockdev nvme passthru vendor specific ...passed 00:25:32.703 Test: blockdev nvme admin passthru ...passed 00:25:32.703 Test: blockdev copy ...passed 00:25:32.703 Suite: bdevio tests on: crypto_ram 00:25:32.703 Test: blockdev write read block ...passed 00:25:32.703 Test: blockdev write zeroes read block ...passed 00:25:32.703 Test: blockdev write zeroes read no split ...passed 00:25:32.703 Test: blockdev write zeroes read split ...passed 00:25:32.703 Test: blockdev write zeroes read split partial ...passed 00:25:32.703 Test: blockdev reset ...passed 00:25:32.703 Test: blockdev write read 8 blocks ...passed 00:25:32.703 Test: blockdev write read size > 128k ...passed 00:25:32.703 Test: blockdev write read invalid size ...passed 00:25:32.703 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:32.703 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:32.703 Test: blockdev write read max offset ...passed 00:25:32.703 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:32.703 Test: blockdev writev readv 8 blocks ...passed 00:25:32.703 Test: blockdev writev readv 30 x 1block ...passed 00:25:32.703 Test: blockdev writev readv block ...passed 00:25:32.703 Test: blockdev writev readv size > 128k ...passed 00:25:32.703 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:32.703 Test: blockdev comparev and writev ...passed 00:25:32.703 Test: blockdev nvme passthru rw ...passed 00:25:32.703 Test: blockdev nvme passthru vendor specific ...passed 00:25:32.703 Test: blockdev nvme admin passthru ...passed 00:25:32.703 Test: blockdev copy ...passed 00:25:32.703 00:25:32.703 Run Summary: Type Total Ran Passed Failed Inactive 00:25:32.703 suites 4 4 n/a 0 0 00:25:32.703 tests 92 92 92 0 0 00:25:32.703 asserts 520 520 520 0 n/a 00:25:32.703 00:25:32.703 Elapsed time = 0.543 seconds 00:25:32.703 0 00:25:32.703 13:48:20 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 119203 00:25:32.703 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 119203 ']' 00:25:32.703 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 119203 00:25:32.703 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:25:32.703 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:32.703 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
119203 00:25:32.961 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:32.961 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:32.961 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 119203' 00:25:32.961 killing process with pid 119203 00:25:32.961 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 119203 00:25:32.961 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 119203 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:25:33.220 00:25:33.220 real 0m3.466s 00:25:33.220 user 0m9.654s 00:25:33.220 sys 0m0.498s 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:33.220 ************************************ 00:25:33.220 END TEST bdev_bounds 00:25:33.220 ************************************ 00:25:33.220 13:48:20 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:25:33.220 13:48:20 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:25:33.220 13:48:20 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:33.220 13:48:20 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:33.220 13:48:20 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:33.220 ************************************ 00:25:33.220 START TEST bdev_nbd 00:25:33.220 ************************************ 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # 
bdev_num=4 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=119674 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 119674 /var/tmp/spdk-nbd.sock 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 119674 ']' 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:25:33.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:33.220 13:48:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:33.479 [2024-07-15 13:48:20.857852] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
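For the nbd test the harness brings up the lightweight bdev_svc app on its own RPC socket, and every nbd RPC that follows targets that socket. A minimal recap, with a crypto bdev then mapped to a kernel nbd node:

  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 \
      --json test/bdev/bdev.json &
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0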
00:25:33.479 [2024-07-15 13:48:20.857900] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:33.479 [2024-07-15 13:48:20.945324] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:33.479 [2024-07-15 13:48:21.034531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:33.479 [2024-07-15 13:48:21.055459] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:33.479 [2024-07-15 13:48:21.063480] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:33.479 [2024-07-15 13:48:21.071498] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:33.737 [2024-07-15 13:48:21.169288] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:36.276 [2024-07-15 13:48:23.361417] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:36.276 [2024-07-15 13:48:23.361472] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:36.276 [2024-07-15 13:48:23.361483] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:36.276 [2024-07-15 13:48:23.369435] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:36.276 [2024-07-15 13:48:23.369449] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:36.276 [2024-07-15 13:48:23.369457] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:36.276 [2024-07-15 13:48:23.377455] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:36.276 [2024-07-15 13:48:23.377468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:36.276 [2024-07-15 13:48:23.377475] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:36.276 [2024-07-15 13:48:23.385476] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:36.276 [2024-07-15 13:48:23.385488] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:36.276 [2024-07-15 13:48:23.385495] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.276 1+0 records in 00:25:36.276 1+0 records out 00:25:36.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239972 s, 17.1 MB/s 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:36.276 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:25:36.276 
13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.534 1+0 records in 00:25:36.534 1+0 records out 00:25:36.534 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255958 s, 16.0 MB/s 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:36.534 13:48:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.534 1+0 records in 00:25:36.534 1+0 records out 00:25:36.534 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000168496 s, 24.3 MB/s 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:36.534 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.792 1+0 records in 00:25:36.792 1+0 records out 00:25:36.792 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261185 s, 15.7 MB/s 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:36.792 13:48:24 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:36.792 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd0", 00:25:37.049 "bdev_name": "crypto_ram" 00:25:37.049 }, 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd1", 00:25:37.049 "bdev_name": "crypto_ram2" 00:25:37.049 }, 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd2", 00:25:37.049 "bdev_name": "crypto_ram3" 00:25:37.049 }, 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd3", 00:25:37.049 "bdev_name": "crypto_ram4" 00:25:37.049 } 00:25:37.049 ]' 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd0", 00:25:37.049 "bdev_name": "crypto_ram" 00:25:37.049 }, 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd1", 00:25:37.049 "bdev_name": "crypto_ram2" 00:25:37.049 }, 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd2", 00:25:37.049 "bdev_name": "crypto_ram3" 00:25:37.049 }, 00:25:37.049 { 00:25:37.049 "nbd_device": "/dev/nbd3", 00:25:37.049 "bdev_name": "crypto_ram4" 00:25:37.049 } 00:25:37.049 ]' 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.049 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.306 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.563 13:48:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.564 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
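The start/stop verification is symmetric: each mapped /dev/nbdX is detached again and the disk list is then re-read to confirm nothing is left attached before the data pass; roughly:

  for nbd in /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3; do
      ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$nbd"
  done
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks \
    | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true   # expect 0 here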
00:25:37.820 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:38.078 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:25:38.337 /dev/nbd0 00:25:38.337 13:48:25 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.337 1+0 records in 00:25:38.337 1+0 records out 00:25:38.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247807 s, 16.5 MB/s 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:38.337 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:25:38.594 /dev/nbd1 00:25:38.594 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:38.594 13:48:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.595 1+0 records in 00:25:38.595 1+0 records out 00:25:38.595 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311022 s, 13.2 MB/s 00:25:38.595 13:48:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:25:38.595 /dev/nbd10 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.595 1+0 records in 00:25:38.595 1+0 records out 00:25:38.595 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296608 s, 13.8 MB/s 00:25:38.595 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:25:38.853 /dev/nbd11 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.853 1+0 records in 00:25:38.853 1+0 records out 00:25:38.853 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290387 s, 14.1 MB/s 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:38.853 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd0", 00:25:39.110 "bdev_name": "crypto_ram" 00:25:39.110 }, 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd1", 00:25:39.110 "bdev_name": "crypto_ram2" 00:25:39.110 }, 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd10", 00:25:39.110 "bdev_name": "crypto_ram3" 00:25:39.110 }, 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd11", 00:25:39.110 "bdev_name": "crypto_ram4" 00:25:39.110 } 00:25:39.110 ]' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd0", 00:25:39.110 "bdev_name": "crypto_ram" 00:25:39.110 }, 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd1", 00:25:39.110 "bdev_name": "crypto_ram2" 00:25:39.110 }, 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd10", 00:25:39.110 "bdev_name": "crypto_ram3" 00:25:39.110 }, 00:25:39.110 { 00:25:39.110 "nbd_device": "/dev/nbd11", 00:25:39.110 "bdev_name": "crypto_ram4" 00:25:39.110 } 00:25:39.110 ]' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:25:39.110 /dev/nbd1 00:25:39.110 /dev/nbd10 00:25:39.110 /dev/nbd11' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:25:39.110 /dev/nbd1 00:25:39.110 /dev/nbd10 00:25:39.110 /dev/nbd11' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:25:39.110 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:25:39.110 256+0 records in 00:25:39.111 256+0 records out 00:25:39.111 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00500758 s, 209 MB/s 00:25:39.111 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:39.111 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:25:39.111 256+0 records in 00:25:39.111 256+0 records out 00:25:39.111 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0413418 s, 25.4 MB/s 00:25:39.111 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:39.111 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:25:39.369 256+0 records in 00:25:39.369 256+0 records out 00:25:39.369 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0440897 s, 23.8 MB/s 00:25:39.369 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:39.369 13:48:26 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:25:39.369 256+0 records in 00:25:39.369 256+0 records out 00:25:39.369 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0338224 s, 31.0 MB/s 00:25:39.369 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:39.369 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:25:39.369 256+0 records in 00:25:39.369 256+0 records out 00:25:39.369 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0295952 s, 35.4 MB/s 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:39.370 
13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.370 13:48:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.628 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.887 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:25:40.196 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:25:40.196 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:40.197 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:25:40.455 13:48:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:25:40.455 malloc_lvol_verify 00:25:40.455 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:25:40.714 d6220323-ccd8-48dd-92da-8b117115ad90 00:25:40.714 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:25:40.973 96c0047a-dd29-408d-885e-4bdea806ba73 00:25:40.973 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:25:40.973 /dev/nbd0 00:25:40.973 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:25:40.973 mke2fs 1.46.5 (30-Dec-2021) 00:25:40.973 Discarding device blocks: 0/4096 done 00:25:40.973 Creating filesystem with 4096 1k blocks and 1024 inodes 00:25:40.973 00:25:40.973 Allocating group tables: 0/1 done 00:25:40.973 Writing inode tables: 0/1 done 00:25:41.231 Creating journal (1024 blocks): done 00:25:41.231 Writing superblocks and filesystem accounting information: 0/1 done 00:25:41.231 00:25:41.231 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:25:41.231 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:25:41.231 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:41.231 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:41.231 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 119674 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 119674 ']' 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 119674 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 119674 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 119674' 00:25:41.232 killing process with pid 119674 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 119674 00:25:41.232 13:48:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 119674 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:25:41.800 00:25:41.800 real 0m8.416s 00:25:41.800 user 0m10.489s 00:25:41.800 sys 0m3.227s 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:41.800 ************************************ 00:25:41.800 END TEST bdev_nbd 00:25:41.800 ************************************ 00:25:41.800 13:48:29 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:25:41.800 13:48:29 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:25:41.800 13:48:29 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:25:41.800 13:48:29 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:41.800 13:48:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:41.800 ************************************ 00:25:41.800 START TEST bdev_fio 00:25:41.800 ************************************ 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:41.800 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:41.800 13:48:29 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:41.800 ************************************ 00:25:41.800 START TEST bdev_fio_rw_verify 00:25:41.800 ************************************ 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:41.801 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:42.059 13:48:29 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:42.318 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:42.318 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:42.318 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:42.318 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:42.318 fio-3.35 00:25:42.318 Starting 4 threads 00:25:57.191 00:25:57.191 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=121452: Mon Jul 15 13:48:42 2024 00:25:57.191 read: IOPS=28.0k, BW=109MiB/s (115MB/s)(1092MiB/10001msec) 00:25:57.191 slat (usec): min=11, max=452, avg=48.94, stdev=37.24 00:25:57.191 clat (usec): min=9, max=2087, avg=261.49, stdev=202.34 00:25:57.191 lat (usec): min=36, max=2381, avg=310.42, stdev=227.57 00:25:57.191 clat percentiles (usec): 00:25:57.191 | 50.000th=[ 212], 99.000th=[ 1090], 99.900th=[ 1336], 99.990th=[ 1516], 00:25:57.191 | 99.999th=[ 1991] 00:25:57.191 write: IOPS=30.7k, BW=120MiB/s (126MB/s)(1171MiB/9753msec); 0 zone resets 00:25:57.191 slat (usec): min=17, max=437, avg=57.47, stdev=36.52 00:25:57.191 clat (usec): min=23, max=1951, avg=310.20, stdev=231.47 00:25:57.191 lat (usec): min=45, max=2093, avg=367.67, stdev=256.06 00:25:57.191 clat percentiles (usec): 00:25:57.191 | 50.000th=[ 260], 99.000th=[ 1270], 99.900th=[ 1582], 99.990th=[ 1745], 00:25:57.191 | 99.999th=[ 1909] 00:25:57.191 bw ( KiB/s): min=105920, max=139144, per=97.70%, avg=120150.74, stdev=2272.64, samples=76 00:25:57.191 iops : min=26480, max=34786, avg=30037.68, stdev=568.16, samples=76 00:25:57.191 lat (usec) : 10=0.01%, 20=0.01%, 50=1.69%, 100=10.87%, 250=42.23% 00:25:57.191 lat (usec) : 500=33.58%, 750=6.81%, 1000=2.76% 00:25:57.191 lat (msec) : 2=2.06%, 4=0.01% 00:25:57.191 cpu : usr=99.65%, sys=0.00%, ctx=107, majf=0, minf=318 00:25:57.191 IO depths : 1=10.4%, 2=25.5%, 4=51.0%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:25:57.191 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:57.191 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:57.191 issued rwts: 
total=279578,299864,0,0 short=0,0,0,0 dropped=0,0,0,0 00:25:57.191 latency : target=0, window=0, percentile=100.00%, depth=8 00:25:57.191 00:25:57.191 Run status group 0 (all jobs): 00:25:57.191 READ: bw=109MiB/s (115MB/s), 109MiB/s-109MiB/s (115MB/s-115MB/s), io=1092MiB (1145MB), run=10001-10001msec 00:25:57.191 WRITE: bw=120MiB/s (126MB/s), 120MiB/s-120MiB/s (126MB/s-126MB/s), io=1171MiB (1228MB), run=9753-9753msec 00:25:57.191 00:25:57.191 real 0m13.452s 00:25:57.191 user 0m45.693s 00:25:57.191 sys 0m0.485s 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:25:57.191 ************************************ 00:25:57.191 END TEST bdev_fio_rw_verify 00:25:57.191 ************************************ 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:25:57.191 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:57.192 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "27a8969c-f227-530f-ad3b-cbe05b32c610"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "27a8969c-f227-530f-ad3b-cbe05b32c610",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "85829125-c180-5331-8d99-15ffb22db12e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "85829125-c180-5331-8d99-15ffb22db12e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1b139d05-18ff-53b3-b617-7ab042a062d4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1b139d05-18ff-53b3-b617-7ab042a062d4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram4",' ' "aliases": [' ' "1d6c75a6-efac-5ff1-a16c-891799d1075e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1d6c75a6-efac-5ff1-a16c-891799d1075e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:57.192 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:25:57.192 crypto_ram2 00:25:57.192 crypto_ram3 00:25:57.192 crypto_ram4 ]] 00:25:57.192 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:57.192 13:48:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "27a8969c-f227-530f-ad3b-cbe05b32c610"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "27a8969c-f227-530f-ad3b-cbe05b32c610",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "85829125-c180-5331-8d99-15ffb22db12e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "85829125-c180-5331-8d99-15ffb22db12e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1b139d05-18ff-53b3-b617-7ab042a062d4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1b139d05-18ff-53b3-b617-7ab042a062d4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1d6c75a6-efac-5ff1-a16c-891799d1075e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1d6c75a6-efac-5ff1-a16c-891799d1075e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:25:57.192 13:48:43 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:57.192 ************************************ 00:25:57.192 START TEST bdev_fio_trim 00:25:57.192 ************************************ 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:25:57.192 13:48:43 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:57.192 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:57.193 13:48:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:57.193 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:57.193 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:57.193 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:57.193 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:57.193 fio-3.35 00:25:57.193 Starting 4 threads 00:26:09.397 00:26:09.397 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=123311: Mon Jul 15 13:48:56 2024 00:26:09.397 write: IOPS=40.7k, BW=159MiB/s (167MB/s)(1588MiB/10001msec); 0 zone resets 00:26:09.397 slat (usec): min=12, max=1277, avg=56.47, stdev=29.76 00:26:09.397 clat (usec): min=28, max=1876, avg=248.25, stdev=156.47 
00:26:09.397 lat (usec): min=56, max=1988, avg=304.72, stdev=174.68 00:26:09.397 clat percentiles (usec): 00:26:09.397 | 50.000th=[ 212], 99.000th=[ 766], 99.900th=[ 865], 99.990th=[ 963], 00:26:09.397 | 99.999th=[ 1369] 00:26:09.397 bw ( KiB/s): min=146496, max=219616, per=100.00%, avg=162879.16, stdev=4040.94, samples=76 00:26:09.397 iops : min=36624, max=54904, avg=40719.79, stdev=1010.23, samples=76 00:26:09.397 trim: IOPS=40.7k, BW=159MiB/s (167MB/s)(1588MiB/10001msec); 0 zone resets 00:26:09.397 slat (usec): min=4, max=101, avg=16.33, stdev= 6.46 00:26:09.397 clat (usec): min=40, max=1659, avg=234.38, stdev=104.10 00:26:09.397 lat (usec): min=45, max=1671, avg=250.72, stdev=105.81 00:26:09.397 clat percentiles (usec): 00:26:09.397 | 50.000th=[ 223], 99.000th=[ 529], 99.900th=[ 603], 99.990th=[ 725], 00:26:09.397 | 99.999th=[ 914] 00:26:09.397 bw ( KiB/s): min=146504, max=219640, per=100.00%, avg=162880.84, stdev=4041.87, samples=76 00:26:09.397 iops : min=36626, max=54910, avg=40720.21, stdev=1010.47, samples=76 00:26:09.397 lat (usec) : 50=0.33%, 100=9.07%, 250=51.73%, 500=33.67%, 750=4.60% 00:26:09.397 lat (usec) : 1000=0.60% 00:26:09.397 lat (msec) : 2=0.01% 00:26:09.397 cpu : usr=99.67%, sys=0.00%, ctx=61, majf=0, minf=106 00:26:09.397 IO depths : 1=7.8%, 2=26.3%, 4=52.7%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:09.397 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:09.397 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:09.397 issued rwts: total=0,406566,406567,0 short=0,0,0,0 dropped=0,0,0,0 00:26:09.397 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:09.397 00:26:09.397 Run status group 0 (all jobs): 00:26:09.397 WRITE: bw=159MiB/s (167MB/s), 159MiB/s-159MiB/s (167MB/s-167MB/s), io=1588MiB (1665MB), run=10001-10001msec 00:26:09.397 TRIM: bw=159MiB/s (167MB/s), 159MiB/s-159MiB/s (167MB/s-167MB/s), io=1588MiB (1665MB), run=10001-10001msec 00:26:09.397 00:26:09.397 real 0m13.395s 00:26:09.397 user 0m45.622s 00:26:09.397 sys 0m0.427s 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:26:09.397 ************************************ 00:26:09.397 END TEST bdev_fio_trim 00:26:09.397 ************************************ 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:26:09.397 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:26:09.397 00:26:09.397 real 0m27.208s 00:26:09.397 user 1m31.498s 00:26:09.397 sys 0m1.115s 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:09.397 ************************************ 00:26:09.397 END TEST bdev_fio 00:26:09.397 ************************************ 00:26:09.397 13:48:56 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 
00:26:09.397 13:48:56 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:09.397 13:48:56 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:09.397 13:48:56 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:09.397 13:48:56 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:09.397 13:48:56 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:09.397 ************************************ 00:26:09.397 START TEST bdev_verify 00:26:09.397 ************************************ 00:26:09.397 13:48:56 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:09.397 [2024-07-15 13:48:56.645556] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:09.397 [2024-07-15 13:48:56.645606] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid124746 ] 00:26:09.397 [2024-07-15 13:48:56.733044] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:09.398 [2024-07-15 13:48:56.823813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:09.398 [2024-07-15 13:48:56.823815] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:09.398 [2024-07-15 13:48:56.844879] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:09.398 [2024-07-15 13:48:56.852900] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:09.398 [2024-07-15 13:48:56.860921] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:09.398 [2024-07-15 13:48:56.957010] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:11.934 [2024-07-15 13:48:59.142730] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:11.934 [2024-07-15 13:48:59.142813] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:11.934 [2024-07-15 13:48:59.142824] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:11.934 [2024-07-15 13:48:59.150745] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:11.934 [2024-07-15 13:48:59.150759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:11.934 [2024-07-15 13:48:59.150766] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:11.934 [2024-07-15 13:48:59.158766] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:11.934 [2024-07-15 13:48:59.158779] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:11.934 [2024-07-15 13:48:59.158786] vbdev_crypto.c: 617:create_crypto_disk: 
*NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:11.934 [2024-07-15 13:48:59.166788] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:11.934 [2024-07-15 13:48:59.166800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:11.934 [2024-07-15 13:48:59.166808] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:11.934 Running I/O for 5 seconds... 00:26:17.206 00:26:17.206 Latency(us) 00:26:17.206 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:17.206 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x0 length 0x1000 00:26:17.206 crypto_ram : 5.05 705.76 2.76 0.00 0.00 180894.46 3362.28 113975.65 00:26:17.206 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x1000 length 0x1000 00:26:17.206 crypto_ram : 5.06 708.59 2.77 0.00 0.00 180366.49 2991.86 113975.65 00:26:17.206 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x0 length 0x1000 00:26:17.206 crypto_ram2 : 5.06 708.74 2.77 0.00 0.00 179947.74 3704.21 103489.89 00:26:17.206 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x1000 length 0x1000 00:26:17.206 crypto_ram2 : 5.06 708.69 2.77 0.00 0.00 179954.78 3034.60 103489.89 00:26:17.206 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x0 length 0x1000 00:26:17.206 crypto_ram3 : 5.04 5514.67 21.54 0.00 0.00 23038.89 5442.34 17324.30 00:26:17.206 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x1000 length 0x1000 00:26:17.206 crypto_ram3 : 5.05 5528.87 21.60 0.00 0.00 22995.27 3590.23 17666.23 00:26:17.206 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x0 length 0x1000 00:26:17.206 crypto_ram4 : 5.05 5529.23 21.60 0.00 0.00 22960.07 2849.39 17438.27 00:26:17.206 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:17.206 Verification LBA range: start 0x1000 length 0x1000 00:26:17.206 crypto_ram4 : 5.05 5528.66 21.60 0.00 0.00 22956.19 3932.16 17210.32 00:26:17.206 =================================================================================================================== 00:26:17.206 Total : 24933.21 97.40 0.00 0.00 40891.07 2849.39 113975.65 00:26:17.206 00:26:17.206 real 0m8.097s 00:26:17.206 user 0m15.456s 00:26:17.206 sys 0m0.338s 00:26:17.206 13:49:04 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:17.206 13:49:04 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:26:17.206 ************************************ 00:26:17.206 END TEST bdev_verify 00:26:17.206 ************************************ 00:26:17.206 13:49:04 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:26:17.206 13:49:04 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:17.206 13:49:04 
blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:17.206 13:49:04 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:17.206 13:49:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:17.206 ************************************ 00:26:17.206 START TEST bdev_verify_big_io 00:26:17.206 ************************************ 00:26:17.206 13:49:04 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:17.465 [2024-07-15 13:49:04.825886] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:17.465 [2024-07-15 13:49:04.825931] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid125812 ] 00:26:17.465 [2024-07-15 13:49:04.909892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:17.465 [2024-07-15 13:49:04.990866] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:17.465 [2024-07-15 13:49:04.990869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:17.466 [2024-07-15 13:49:05.011895] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:17.466 [2024-07-15 13:49:05.019919] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:17.466 [2024-07-15 13:49:05.027939] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:17.725 [2024-07-15 13:49:05.122705] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:20.260 [2024-07-15 13:49:07.310408] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:20.260 [2024-07-15 13:49:07.310468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:20.260 [2024-07-15 13:49:07.310494] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:20.260 [2024-07-15 13:49:07.318427] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:20.260 [2024-07-15 13:49:07.318442] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:20.260 [2024-07-15 13:49:07.318450] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:20.260 [2024-07-15 13:49:07.326450] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:20.260 [2024-07-15 13:49:07.326465] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:20.260 [2024-07-15 13:49:07.326472] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:20.260 [2024-07-15 13:49:07.334472] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:20.260 [2024-07-15 13:49:07.334487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:20.260 [2024-07-15 13:49:07.334495] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:20.260 Running I/O for 5 seconds... 00:26:20.521 [2024-07-15 13:49:07.986411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:20.521 [2024-07-15 13:49:07.986705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:20.521 [2024-07-15 13:49:07.986831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:20.521 [2024-07-15 13:49:07.986878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:20.521 [2024-07-15 13:49:07.986928] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:20.521 [2024-07-15 13:49:07.987244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.988966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.989756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.989806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.989835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.989863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.990234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.990279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.990319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.990348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.521 [2024-07-15 13:49:07.990632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.991971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.992203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.993890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.994692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.994729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.994758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.994785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.521 [2024-07-15 13:49:07.995110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.522 [2024-07-15 13:49:07.995154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.995194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.995233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.995448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.996782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.997041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.997815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.997850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.997891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.997918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.998322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.998364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.998406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.998434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.998684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.999484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.999530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.522 [2024-07-15 13:49:07.999571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.999601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:07.999951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.000002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.000032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.000059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.000399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.001757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.002014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.002891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.002936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.002979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.003011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.003351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.003383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.003411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.003438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.522 [2024-07-15 13:49:08.003699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.004465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.004516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.004545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.004573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.004940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.004971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.005016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.005045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.005341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.007894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.008581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.008617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.008648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.008675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.009042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.522 [2024-07-15 13:49:08.009087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.009115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.009142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.009415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.010819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.011077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.011788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.011836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.011872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.011900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.012273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.012306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.012345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.012373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.012656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.013471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.013521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.522 [2024-07-15 13:49:08.013551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.522 [2024-07-15 13:49:08.013578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.013921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.013956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.013984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.014016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.014252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.015857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.016647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.016700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.016729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.016757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.017113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.017158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.017188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.017233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.523 [2024-07-15 13:49:08.017471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.018793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.019038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.019971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.020626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.523 [2024-07-15 13:49:08.021694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.021930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.022839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.022876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.022908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.022934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.023209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.023242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.023269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.023296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.023483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.024779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.025833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.025868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.523 [2024-07-15 13:49:08.025895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.026228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.026261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.026294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.027627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.028569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.028606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.028637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.028665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.029088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.029122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.029157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.029194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.523 [2024-07-15 13:49:08.030376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.523 [2024-07-15 13:49:08.030467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.031800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.032844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.032884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.032916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.032943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.033298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.033331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.033359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.033385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.524 [2024-07-15 13:49:08.034695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.034784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.035838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.035874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.035902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.035928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.036205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.036241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.036269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.036295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.037158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.037727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.038704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.039724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.040010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.041044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.041302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.041569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.043450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.044047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.045071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.524 [2024-07-15 13:49:08.046134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.524 [2024-07-15 13:49:08.047624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:26:20.524 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats continuously from 13:49:08.047 through 13:49:08.306; duplicate log entries omitted ...]
00:26:20.790 [2024-07-15 13:49:08.306660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:26:20.790 [2024-07-15 13:49:08.306696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.306733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.306773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.307535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.307578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.307608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.307634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.307951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.308071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.308103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.308132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.308159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.308960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.309425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.310264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.310301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.311526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.790 [2024-07-15 13:49:08.311667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.311775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.311805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.311832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.311869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.313127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.314407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.315632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.316772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.316961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.317093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.318066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.319093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.320283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.322372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.323393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.324604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.325438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.325621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.326664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.327646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.328800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.329065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.330942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.790 [2024-07-15 13:49:08.332120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.333010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.334085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.334319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.335426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.336630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.336896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.337154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.339109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.339934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.341066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.342170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.342389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.343643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.343917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.344176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.344942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.346679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.347748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.348789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.349821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.350013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.350358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.350626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.790 [2024-07-15 13:49:08.351319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.352289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.353796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.354785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.355803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.356979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.357226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.357561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.357978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.358977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.359964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.361725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.362736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.363956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.364462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.364758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.365177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.366127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.367131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.368302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.790 [2024-07-15 13:49:08.370088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.371312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.371926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.372189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.791 [2024-07-15 13:49:08.372476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.373616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.374689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.375853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.376855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.378817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.379620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.379897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.380164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.380351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.381639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.382865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.384012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.384798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.386484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.386748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.387014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.388236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.388416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.389522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.390752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.391394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.392374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.393416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:20.791 [2024-07-15 13:49:08.393680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.394848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.396009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.396211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.397492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.398086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.399011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.399959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.401045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:20.791 [2024-07-15 13:49:08.401973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.402916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.403889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.404079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.404661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.405791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.406966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.408167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.409678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.410653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.411640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.412812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.413046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.414387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.415620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.053 [2024-07-15 13:49:08.416852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.418013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.419848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.420864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.422052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.422481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.422663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.423964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.425176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.426164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.426424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.428353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.429544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.429981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.431078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.431262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.432564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.433671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.053 [2024-07-15 13:49:08.433929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.434186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.436144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.436801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.438026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.439258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.054 [2024-07-15 13:49:08.439442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.440732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.441000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.441257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.442166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.443750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.444939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.446130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.447306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.447489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.447828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.448091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.448871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.449843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.451612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.452638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.453633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.454813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.455088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.455422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.456092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.457053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.458039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.459714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.054 [2024-07-15 13:49:08.460693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.461894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.462250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.462492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.463098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.464092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.465088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.466266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.468088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.469275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.469732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.469993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.470230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.471266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.472283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.473485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.474248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.476257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.476761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.477023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.477284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.477470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.478670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.478950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.054 [2024-07-15 13:49:08.480085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.481254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.482998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.483931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.484775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.485829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.486058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.487064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.487549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.487806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.488068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.489246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.489534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.489804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.490080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.490337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.490681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.490955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.491229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.491496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.492741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.493016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.493288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.493560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.054 [2024-07-15 13:49:08.493867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.494219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.494490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.494761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.495031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.496220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.496491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.496758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.497032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.497316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.497665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.497940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.498210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.498556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.499888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.500173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.500440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.500703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.500927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.501279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.501555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.501822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.054 [2024-07-15 13:49:08.502095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.503475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.055 [2024-07-15 13:49:08.503753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.504032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.504301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.504623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.504969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.505248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.505515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.505778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.507022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.507293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.507573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.507854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.508162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.508508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.508781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.509058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.509323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.510634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.510907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.511182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.511453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.511729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.512085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.512359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.055 [2024-07-15 13:49:08.512627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.512907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.514113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.514388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.514656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.514926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.515199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.515544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.515811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.516084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.516353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.517524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.517801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.518074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.518342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.518600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.518946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.519223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.519492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.519774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.521317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.521612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.521879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.055 [2024-07-15 13:49:08.521913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.055 [2024-07-15 13:49:08.522260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[identical accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources *ERROR*: Failed to get src_mbufs! entries repeat continuously between 13:49:08.522260 and 13:49:08.731969]
00:26:21.324 [2024-07-15 13:49:08.731969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:26:21.324 [2024-07-15 13:49:08.732326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.732666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.732942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.733226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.733488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.734701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.734974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.735268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.735541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.735879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.736242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.736518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.736790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.737066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.738344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.738613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.738888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.739160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.739511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.739878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.740161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.740430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.740700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.741910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.324 [2024-07-15 13:49:08.742194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.742468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.742737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.743014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.743362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.743636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.743907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.744185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.745351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.745630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.745902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.746177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.746479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.746823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.747104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.747377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.747644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.748953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.749240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.749511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.749783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.750137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.750486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.750759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.324 [2024-07-15 13:49:08.751037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.751308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.753323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.754022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.755264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.756404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.756625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.757866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.758150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.758420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.758929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.760902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.324 [2024-07-15 13:49:08.762000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.762272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.762542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.762797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.764141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.764882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.765481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.765750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.767817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.768119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.769312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.769350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.325 [2024-07-15 13:49:08.769538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.770757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.771051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.771327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.772200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.773741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.773798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.773831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.773859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.774051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.775167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.775207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.775236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.775263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.776741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.777651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.325 [2024-07-15 13:49:08.777691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.777727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.777757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.777966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.778085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.778119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.778160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.778195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.779632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.780570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.780613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.780644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.780671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.780850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.780961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.781001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.325 [2024-07-15 13:49:08.781029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.781056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.781925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.781965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.781993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.782027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.782208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.782316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.782347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.782374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.782400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.783459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.783495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.783527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.783554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.325 [2024-07-15 13:49:08.783773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.783884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.783917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.783945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.783980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.784891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.784928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.784957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.784985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.326 [2024-07-15 13:49:08.785221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.785331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.785362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.785396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.785423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.786944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.787747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.787784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.787812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.787839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.788087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.788213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.788250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.788278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.788305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.326 [2024-07-15 13:49:08.789214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.326 [2024-07-15 13:49:08.789737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.789765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.790596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.790632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.790660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.790687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.790924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.791048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.349 [2024-07-15 13:49:08.791080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.791115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.791146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.350 [2024-07-15 13:49:08.792531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.792559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.793918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.794742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.794779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.794807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.794840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.795024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.795137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.795168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.795195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.795223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.350 [2024-07-15 13:49:08.796415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.796620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.797972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.798007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.798898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.798933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.798961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.799014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.799194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.799303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.799333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.799362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.799407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.800535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.350 [2024-07-15 13:49:08.800573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.800605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.800648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.800831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.800946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.800978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.801012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.801040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.801885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.801926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.801955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.801982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.802176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.802289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.802321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.350 [2024-07-15 13:49:08.802352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.802379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.351 [2024-07-15 13:49:08.803889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.803922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.804765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.804807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.804839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.804866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.805067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.805179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.805211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.805246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.805274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.806721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.807591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.807630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.807658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.807684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.351 [2024-07-15 13:49:08.807897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.808010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.808042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.808076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.808106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.808956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.808991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.809498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.351 [2024-07-15 13:49:08.810832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.352 [2024-07-15 13:49:08.811656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.352 [2024-07-15 13:49:08.811693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:26:21.621 [2024-07-15 13:49:09.066247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously between the two timestamps above; duplicate log lines omitted)
00:26:21.621 [2024-07-15 13:49:09.066299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.066792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.070567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.072666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.072703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.072730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.072763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.072946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.621 [2024-07-15 13:49:09.073069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.073106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.622 [2024-07-15 13:49:09.073134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.073162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.076616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.079612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.079652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.079680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.079708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.079945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.080062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.080093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.080120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.080147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.082728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.082782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.082809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.082836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.622 [2024-07-15 13:49:09.083069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.083179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.083219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.083246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.083272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.086986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.087018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.087052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.089810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.092802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.622 [2024-07-15 13:49:09.092838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.092873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.092901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.093175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.093284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.093314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.093341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.093369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.096804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.622 [2024-07-15 13:49:09.132059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.132115] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.133021] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.133067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.134244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.134438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.135063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.622 [2024-07-15 13:49:09.136343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.137598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.138889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.140347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.141348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.142547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.143753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.144010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.145319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.146577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.147797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.148730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.150569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.151754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.152932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.153220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.153426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.154720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.155889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.156797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.157062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.159184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.160379] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.622 [2024-07-15 13:49:09.160651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.623 [2024-07-15 13:49:09.161713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.161904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.163178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.164077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.164341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.164601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.166598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.166949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.168107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.169281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.169468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.170556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.170830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.171102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.172033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.173414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.174635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.175809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.177061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.177250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.177592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.177858] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.178683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.179652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.623 [2024-07-15 13:49:09.181377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.182564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.182849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.183122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.183405] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.183756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.184051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.184337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.184605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.185922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.187174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.188359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.189591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.189878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.190237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.190818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.191787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.192937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.194905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.195479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.195759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.196026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.196249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.196805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.623 [2024-07-15 13:49:09.197435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.198071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.198334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.199534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.199809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.200082] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.200347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.200657] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.201013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.201291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.201567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.201864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.203295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.203583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.203851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.204135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.204366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.204724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.204990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.205258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.205526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.206849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.207130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.207403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.623 [2024-07-15 13:49:09.207662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.207945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.208297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.208566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.208839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.209108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.210312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.210587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.210855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.211124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.211337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.211687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.211965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.212239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.212497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.213689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.213962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.214255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.214526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.214772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.215139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.215424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.215698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.623 [2024-07-15 13:49:09.215961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.624 [2024-07-15 13:49:09.217135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.217419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.217683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.217950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.218224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.218566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.218836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.219116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.219381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.220570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.220846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.221113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.221380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.221656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.222001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.222278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.222544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.222812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.223921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.224203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.224468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.224737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.225043] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.225397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.624 [2024-07-15 13:49:09.225666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.225930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.226204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.227568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.227844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.228137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.228406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.228637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.228992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.229273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.229542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.229812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.231151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.231435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.231711] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.231989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.624 [2024-07-15 13:49:09.232320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.233363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.234519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.234796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.235980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.237446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.238430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.239645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.886 [2024-07-15 13:49:09.240272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.240464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.241527] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.242687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.242972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.243242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.245050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.245415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.886 [2024-07-15 13:49:09.245693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.245963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.246178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.247412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.247695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.248841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.250107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.252028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.253112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.253938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.254791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.255047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.255406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.255691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.256829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.258091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.887 [2024-07-15 13:49:09.260260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.261308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.261574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.261836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.262030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.263071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.264268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.265509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.266048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.267820] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.268092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.268355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.269420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.269658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.270952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.272190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.272731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.273706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.274721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.274991] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.276138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.277263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.277453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.278676] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:26:21.887 [2024-07-15 13:49:09.279350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.280303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.281493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:26:21.887 [2024-07-15 13:49:09.282777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.282819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.283815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.283847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.284037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.285336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.285933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.287066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.288176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.289284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.289326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.289952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.289982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.290228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.291467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.292627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.292661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.292920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.294910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.294952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.295214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.887 [2024-07-15 13:49:09.295245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.295537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.295652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.296850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.296889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.298154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.300088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.300130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.301309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.301341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.301525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.301642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.301971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.302008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.302263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.304301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.304358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.305100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.305131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.305316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.305440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.306671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.306707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.307885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.887 [2024-07-15 13:49:09.309508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.309549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.310508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.310540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.310724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.310847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.311895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.311930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.312794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.314223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.887 [2024-07-15 13:49:09.314272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.314528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.314559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.314798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.314916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.315889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.315922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.315949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.317712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.317754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.318901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.318940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.319130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.319247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.888 [2024-07-15 13:49:09.319280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.319307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.319337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.320971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.321003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.321031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.321852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.321890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.321918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.321946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.322185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.322301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.322332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.322360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.322386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.888 [2024-07-15 13:49:09.323313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.323749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.324560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.324605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.324633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.324664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.324853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.324965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.325947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.326154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.326268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.326313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.326341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.326368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.888 [2024-07-15 13:49:09.327372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.327854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.328752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.328792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.328819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.328846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.329034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.329153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.329184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.329214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.329241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.888 [2024-07-15 13:49:09.330687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.330752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.331985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.332022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.888 [2024-07-15 13:49:09.332805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.332842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.332870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.332898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.333244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.333364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.333395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.333422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.333450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.889 [2024-07-15 13:49:09.334348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.334745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.335572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.335609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.335638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.335665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.335848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.335963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.336000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.336036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.336066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.336929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.336965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.336993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.337025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.337260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.337373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.337411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.337442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.337468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.889 [2024-07-15 13:49:09.338314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.338831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.339659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.339696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.339737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.339765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.340064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.340185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.340215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.340243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.340270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.889 [2024-07-15 13:49:09.341656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.341713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.342548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.342584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.342612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.342640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.342933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.343056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.343088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.343117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.343148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.344591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.889 [2024-07-15 13:49:09.345566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.345946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.346901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.346938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.346972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.347004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.889 [2024-07-15 13:49:09.347186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.347297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.347328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.347356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.347385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.348627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.890 [2024-07-15 13:49:09.349390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.349922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.350754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.350792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.350820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.350853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.351044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.351171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.351204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.351233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.352178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.890 [2024-07-15 13:49:09.353498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.353554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.354542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.355736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.355786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.357028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.357286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.357400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.357433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.357460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.357487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.358292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.358698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.358730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.358986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.359240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.359352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.359384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.360355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.361080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.361876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.361908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.362888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.890 [2024-07-15 13:49:09.363081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.363196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.364422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.364456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.364711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.365543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.366731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.366783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.367950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.368227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.368348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.369391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.369424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.370591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.371464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.371731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.371763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.372868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.373107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.373221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.374449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.374483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.375372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.376141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.890 [2024-07-15 13:49:09.377330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.377363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.377728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.378030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.378141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.890 [2024-07-15 13:49:09.378577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.378609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.379594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.380357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.381358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.381391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.382505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.382689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.382804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.383798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.384722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.385690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.385794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.385908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.387084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.387940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.388762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.389558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.390114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.891 [2024-07-15 13:49:09.390374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.390631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.390816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.392109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.393313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.394274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.394975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.396551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.396819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.397092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.398255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.398467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.398824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.400054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.401306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.401603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.402956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.403865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.404256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.404520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.404803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.405935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.407150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.408141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.891 [2024-07-15 13:49:09.408852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.409880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.410149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.411033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.411998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.412240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.413308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.414289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.414594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.414853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.416263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.417106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.418181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.418579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.418921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.419272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.419543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.419806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.420081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.421328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.421593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.421852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.422118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.422406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.891 [2024-07-15 13:49:09.422747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.423017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.423283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.423547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.424983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.425254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.891 [2024-07-15 13:49:09.425513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.425772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.426014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.426360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.426623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.426883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.427170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.428524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.428793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.429058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.429316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.429531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.429869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.430146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.430405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.430666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.431929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.432208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.892 [2024-07-15 13:49:09.432471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.432731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.432992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.433339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.433609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.433872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.434133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.435312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.435590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.435855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.436123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.436405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.436744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.437014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.437279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.437542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.438772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.439044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.439303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.439562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.439781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.440125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.440388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.440648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:26:21.892 [2024-07-15 13:49:09.440908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.442482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.442751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.443014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.443270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.443484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.443824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.444092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.444349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.444607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.446056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.446350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.446621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.446886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.447084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:26:21.892 [2024-07-15 13:49:09.447435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
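The burst of "Failed to get src_mbufs!" messages above comes from accel_dpdk_cryptodev_task_alloc_resources() in accel_dpdk_cryptodev.c being unable to pull enough source mbufs from the module's pool while the big-I/O verify job keeps 128 large requests in flight; the run still finishes (see the results below), which suggests the task is simply held back and retried once buffers are returned rather than failed outright. As a rough illustration only — not the SPDK implementation, which uses DPDK mempools and rte_pktmbuf_alloc_bulk() — a bounded pool with a retry queue behaves like this sketch; the pool size and task fields are invented for the example:

```python
from collections import deque

class BoundedMbufPool:
    """Toy stand-in for a DPDK mempool: allocation fails when the pool is empty."""
    def __init__(self, size):
        self.free = size

    def alloc_bulk(self, n):
        if n > self.free:
            return None          # mirrors the bulk allocation returning failure
        self.free -= n
        return ["mbuf"] * n

    def free_bulk(self, bufs):
        self.free += len(bufs)

def submit(pool, queued, task):
    """Try to allocate source mbufs; on failure, log and queue the task for a later retry."""
    bufs = pool.alloc_bulk(task["num_blocks"])
    if bufs is None:
        print("ERROR: Failed to get src_mbufs!")  # same message as in the log above
        queued.append(task)                       # retried later instead of failing the I/O
        return False
    task["src_mbufs"] = bufs
    return True

pool, queued = BoundedMbufPool(size=8), deque()
for i in range(4):
    submit(pool, queued, {"id": i, "num_blocks": 4})  # third and fourth submissions queue up
```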
00:26:26.085
00:26:26.085 Latency(us)
00:26:26.085 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:26:26.085 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x0 length 0x100
00:26:26.085 crypto_ram : 5.62 67.24 4.20 0.00 0.00 1859529.35 55392.17 1663132.72
00:26:26.085 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x100 length 0x100
00:26:26.085 crypto_ram : 5.64 64.89 4.06 0.00 0.00 1913257.89 37156.06 1743371.58
00:26:26.085 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x0 length 0x100
00:26:26.085 crypto_ram2 : 5.62 67.57 4.22 0.00 0.00 1806486.79 54024.46 1663132.72
00:26:26.085 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x100 length 0x100
00:26:26.085 crypto_ram2 : 5.65 66.77 4.17 0.00 0.00 1825370.41 25188.62 1721488.25
00:26:26.085 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x0 length 0x100
00:26:26.085 crypto_ram3 : 5.42 444.61 27.79 0.00 0.00 266083.59 43310.75 434019.28
00:26:26.085 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x100 length 0x100
00:26:26.085 crypto_ram3 : 5.41 426.21 26.64 0.00 0.00 276250.87 22339.23 386605.41
00:26:26.085 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x0 length 0x100
00:26:26.085 crypto_ram4 : 5.47 459.59 28.72 0.00 0.00 252373.51 15272.74 434019.28
00:26:26.085 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:26:26.085 Verification LBA range: start 0x100 length 0x100
00:26:26.085 crypto_ram4 : 5.50 445.86 27.87 0.00 0.00 258652.43 2521.71 362898.48
00:26:26.085 ===================================================================================================================
00:26:26.085 Total : 2042.72 127.67 0.00 0.00 476187.13 2521.71 1743371.58
00:26:26.085
00:26:26.085 real 0m8.732s
00:26:26.085 user 0m16.685s
00:26:26.085 sys 0m0.383s
00:26:26.085 13:49:13 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:26:26.085 13:49:13 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:26:26.085 ************************************
00:26:26.085 END TEST bdev_verify_big_io
00:26:26.085 ************************************
00:26:26.085 13:49:13 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:26:26.085 13:49:13 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:26:26.085 13:49:13 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:26:26.085 13:49:13 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:26:26.085 13:49:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:26:26.085 ************************************
00:26:26.085 START TEST bdev_write_zeroes
00:26:26.085 ************************************
00:26:26.085 13:49:13 blockdev_crypto_aesni.bdev_write_zeroes --
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:26.085 [2024-07-15 13:49:13.611725] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:26.085 [2024-07-15 13:49:13.611768] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid126883 ] 00:26:26.085 [2024-07-15 13:49:13.695043] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:26.344 [2024-07-15 13:49:13.782071] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:26.344 [2024-07-15 13:49:13.802939] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:26.344 [2024-07-15 13:49:13.810962] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:26.344 [2024-07-15 13:49:13.818988] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:26.344 [2024-07-15 13:49:13.913895] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:28.879 [2024-07-15 13:49:16.100168] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:28.879 [2024-07-15 13:49:16.100225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:28.879 [2024-07-15 13:49:16.100235] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:28.879 [2024-07-15 13:49:16.108188] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:28.879 [2024-07-15 13:49:16.108201] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:28.879 [2024-07-15 13:49:16.108209] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:28.879 [2024-07-15 13:49:16.116206] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:28.879 [2024-07-15 13:49:16.116218] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:28.879 [2024-07-15 13:49:16.116225] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:28.879 [2024-07-15 13:49:16.124225] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:28.879 [2024-07-15 13:49:16.124236] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:28.879 [2024-07-15 13:49:16.124244] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:28.879 Running I/O for 1 seconds... 
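The four "vbdev creation deferred pending base bdev arrival" notices above correspond to the crypto-over-Malloc stack that bdevperf builds from test/bdev/bdev.json: one crypto vbdev per base Malloc bdev, each bound to one of the test_dek_aesni_cbc_1..4 keys. As a hand-written sketch of what such a config section looks like (bdev_malloc_create and bdev_crypto_create are standard SPDK RPC methods, but the block counts and sizes here are placeholders, not values copied from the real file, and the separate accel_crypto_key_create entries that would define the keys are omitted):

```python
import json

def crypto_over_malloc(idx: int) -> list:
    """One Malloc base bdev plus a crypto vbdev bound to an accel crypto key."""
    return [
        {"method": "bdev_malloc_create",
         "params": {"name": f"Malloc{idx}", "num_blocks": 65536, "block_size": 512}},
        {"method": "bdev_crypto_create",
         "params": {"base_bdev_name": f"Malloc{idx}",
                    "name": f"crypto_ram{idx + 1 if idx else ''}",   # crypto_ram, crypto_ram2, ...
                    "key_name": f"test_dek_aesni_cbc_{idx + 1}"}},
    ]

config = {"subsystems": [{"subsystem": "bdev",
                          "config": [entry for i in range(4)
                                     for entry in crypto_over_malloc(i)]}]}
print(json.dumps(config, indent=2))
```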
00:26:29.815
00:26:29.815 Latency(us)
00:26:29.815 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:26:29.815 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:26:29.815 crypto_ram : 1.02 3023.05 11.81 0.00 0.00 42127.72 3519.00 48781.58
00:26:29.815 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:26:29.815 crypto_ram2 : 1.02 3036.55 11.86 0.00 0.00 41819.60 3462.01 45818.21
00:26:29.815 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:26:29.815 crypto_ram3 : 1.01 23570.25 92.07 0.00 0.00 5377.78 1617.03 6810.05
00:26:29.815 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:26:29.815 crypto_ram4 : 1.01 23608.12 92.22 0.00 0.00 5358.92 1560.04 5955.23
00:26:29.815 ===================================================================================================================
00:26:29.815 Total : 53237.97 207.96 0.00 0.00 9546.37 1560.04 48781.58
00:26:30.074
00:26:30.074 real 0m4.036s
00:26:30.074 user 0m3.698s
00:26:30.074 sys 0m0.303s
00:26:30.074 13:49:17 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:26:30.074 13:49:17 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:26:30.074 ************************************
00:26:30.074 END TEST bdev_write_zeroes
00:26:30.074 ************************************
00:26:30.074 13:49:17 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:26:30.074 13:49:17 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:26:30.074 13:49:17 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:26:30.074 13:49:17 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:26:30.074 13:49:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:26:30.074 ************************************
00:26:30.074 START TEST bdev_json_nonenclosed
00:26:30.074 ************************************
00:26:30.074 13:49:17 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:26:30.363 [2024-07-15 13:49:17.729081] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization...
00:26:30.363 [2024-07-15 13:49:17.729124] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid127427 ]
00:26:30.363 [2024-07-15 13:49:17.811123] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:26:30.363 [2024-07-15 13:49:17.891192] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:26:30.363 [2024-07-15 13:49:17.891262] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:26:30.363 [2024-07-15 13:49:17.891278] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:30.363 [2024-07-15 13:49:17.891286] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:30.647 00:26:30.647 real 0m0.298s 00:26:30.647 user 0m0.186s 00:26:30.647 sys 0m0.110s 00:26:30.647 13:49:17 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:26:30.647 13:49:17 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:30.647 13:49:17 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:26:30.647 ************************************ 00:26:30.647 END TEST bdev_json_nonenclosed 00:26:30.647 ************************************ 00:26:30.647 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:26:30.647 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:26:30.647 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:30.647 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:30.647 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:30.647 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:30.647 ************************************ 00:26:30.647 START TEST bdev_json_nonarray 00:26:30.647 ************************************ 00:26:30.647 13:49:18 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:30.647 [2024-07-15 13:49:18.093613] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:30.647 [2024-07-15 13:49:18.093658] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid127450 ] 00:26:30.647 [2024-07-15 13:49:18.173182] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:30.647 [2024-07-15 13:49:18.254747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:30.647 [2024-07-15 13:49:18.254824] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
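Both negative tests here feed bdevperf a deliberately malformed config and expect it to abort with a non-zero exit (the "es=234" captured by run_test): nonenclosed.json is not wrapped in a top-level object, producing "not enclosed in {}", and nonarray.json has a "subsystems" member that is not an array. The real validation is done in C in json_config_prepare_ctx(); the sketch below reproduces only the two checks and messages seen in the log, with illustrative inline config strings rather than the actual fixture contents:

```python
import json

def prepare_ctx(text: str) -> None:
    """Mimic the two json_config complaints that appear in this log."""
    cfg = json.loads(text)
    if not isinstance(cfg, dict):
        raise ValueError("Invalid JSON configuration: not enclosed in {}.")
    if not isinstance(cfg.get("subsystems"), list):
        raise ValueError("Invalid JSON configuration: 'subsystems' should be an array.")

prepare_ctx('{"subsystems": []}')                 # minimal valid shape
for bad in ('["subsystems", []]',                 # top level is an array -> first error
            '{"subsystems": {"bdev": {}}}'):      # object instead of array -> second error
    try:
        prepare_ctx(bad)
    except ValueError as err:
        print(err)
```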
00:26:30.647 [2024-07-15 13:49:18.254839] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:30.647 [2024-07-15 13:49:18.254848] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:30.906 00:26:30.906 real 0m0.283s 00:26:30.906 user 0m0.178s 00:26:30.906 sys 0m0.104s 00:26:30.906 13:49:18 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:26:30.906 13:49:18 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:30.906 13:49:18 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:26:30.906 ************************************ 00:26:30.906 END TEST bdev_json_nonarray 00:26:30.906 ************************************ 00:26:30.906 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:26:30.906 13:49:18 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:26:30.906 00:26:30.907 real 1m8.382s 00:26:30.907 user 2m34.792s 00:26:30.907 sys 0m7.558s 00:26:30.907 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:30.907 13:49:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:30.907 ************************************ 00:26:30.907 END TEST blockdev_crypto_aesni 00:26:30.907 ************************************ 00:26:30.907 13:49:18 -- common/autotest_common.sh@1142 -- # return 0 00:26:30.907 13:49:18 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:26:30.907 13:49:18 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:30.907 13:49:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:30.907 13:49:18 -- common/autotest_common.sh@10 -- # set +x 00:26:30.907 ************************************ 00:26:30.907 START TEST blockdev_crypto_sw 00:26:30.907 ************************************ 00:26:30.907 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:26:31.166 * Looking for test storage... 
00:26:31.166 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=127682 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 127682 00:26:31.166 13:49:18 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:31.166 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 127682 ']' 00:26:31.166 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:31.166 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:31.166 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:26:31.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:31.166 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:31.166 13:49:18 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:31.166 [2024-07-15 13:49:18.638807] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:31.166 [2024-07-15 13:49:18.638872] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid127682 ] 00:26:31.166 [2024-07-15 13:49:18.725296] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:31.425 [2024-07-15 13:49:18.807433] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:31.991 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:31.991 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:26:31.991 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:26:31.991 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:26:31.991 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:26:31.991 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:31.991 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:32.249 Malloc0 00:26:32.249 Malloc1 00:26:32.249 true 00:26:32.249 true 00:26:32.249 true 00:26:32.249 [2024-07-15 13:49:19.694586] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:32.249 crypto_ram 00:26:32.249 [2024-07-15 13:49:19.702615] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:32.249 crypto_ram2 00:26:32.249 [2024-07-15 13:49:19.710632] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:32.249 crypto_ram3 00:26:32.249 [ 00:26:32.249 { 00:26:32.249 "name": "Malloc1", 00:26:32.249 "aliases": [ 00:26:32.249 "21fa7b81-c08a-44b2-bddd-f7a91c8bd321" 00:26:32.249 ], 00:26:32.249 "product_name": "Malloc disk", 00:26:32.249 "block_size": 4096, 00:26:32.249 "num_blocks": 4096, 00:26:32.249 "uuid": "21fa7b81-c08a-44b2-bddd-f7a91c8bd321", 00:26:32.249 "assigned_rate_limits": { 00:26:32.249 "rw_ios_per_sec": 0, 00:26:32.249 "rw_mbytes_per_sec": 0, 00:26:32.249 "r_mbytes_per_sec": 0, 00:26:32.249 "w_mbytes_per_sec": 0 00:26:32.249 }, 00:26:32.249 "claimed": true, 00:26:32.249 "claim_type": "exclusive_write", 00:26:32.249 "zoned": false, 00:26:32.249 "supported_io_types": { 00:26:32.249 "read": true, 00:26:32.249 "write": true, 00:26:32.249 "unmap": true, 00:26:32.249 "flush": true, 00:26:32.249 "reset": true, 00:26:32.249 "nvme_admin": false, 00:26:32.249 "nvme_io": false, 00:26:32.249 "nvme_io_md": false, 00:26:32.249 "write_zeroes": true, 00:26:32.249 "zcopy": true, 00:26:32.249 "get_zone_info": false, 00:26:32.249 "zone_management": false, 00:26:32.249 "zone_append": false, 00:26:32.249 "compare": false, 00:26:32.249 "compare_and_write": false, 00:26:32.249 "abort": true, 00:26:32.249 "seek_hole": false, 00:26:32.249 "seek_data": false, 00:26:32.249 "copy": true, 00:26:32.249 "nvme_iov_md": false 00:26:32.249 }, 00:26:32.249 "memory_domains": [ 00:26:32.249 { 00:26:32.249 "dma_device_id": "system", 00:26:32.249 "dma_device_type": 1 00:26:32.249 }, 00:26:32.249 { 
00:26:32.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:32.249 "dma_device_type": 2 00:26:32.249 } 00:26:32.249 ], 00:26:32.249 "driver_specific": {} 00:26:32.249 } 00:26:32.249 ] 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:32.249 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:26:32.249 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "a4a0a4d0-b67b-5c40-950c-3131d8e4ad98"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a4a0a4d0-b67b-5c40-950c-3131d8e4ad98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' 
},' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "2293f4b4-8ccf-5e69-8e1a-c865f6b238d7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "2293f4b4-8ccf-5e69-8e1a-c865f6b238d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:26:32.508 13:49:19 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 127682 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 127682 ']' 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 127682 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 127682 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 127682' 00:26:32.508 killing process with pid 127682 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 127682 00:26:32.508 13:49:19 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 127682 00:26:32.767 13:49:20 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:32.767 13:49:20 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:32.767 13:49:20 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:32.767 13:49:20 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:32.767 13:49:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:33.025 ************************************ 00:26:33.025 START TEST bdev_hello_world 00:26:33.025 ************************************ 00:26:33.025 13:49:20 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:33.025 [2024-07-15 13:49:20.448595] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:33.025 [2024-07-15 13:49:20.448648] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid127885 ] 00:26:33.025 [2024-07-15 13:49:20.534927] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:33.025 [2024-07-15 13:49:20.624598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:33.284 [2024-07-15 13:49:20.789970] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:33.284 [2024-07-15 13:49:20.790038] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:33.284 [2024-07-15 13:49:20.790048] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:33.284 [2024-07-15 13:49:20.797988] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:33.284 [2024-07-15 13:49:20.798003] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:33.284 [2024-07-15 13:49:20.798010] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:33.284 [2024-07-15 13:49:20.806007] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:33.284 [2024-07-15 13:49:20.806018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:33.284 [2024-07-15 13:49:20.806025] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:33.284 [2024-07-15 13:49:20.845719] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:26:33.284 [2024-07-15 13:49:20.845751] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:26:33.284 [2024-07-15 13:49:20.845763] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:26:33.284 [2024-07-15 13:49:20.847027] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:26:33.284 [2024-07-15 13:49:20.847109] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:26:33.284 [2024-07-15 13:49:20.847120] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:26:33.284 [2024-07-15 13:49:20.847145] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:26:33.284 00:26:33.284 [2024-07-15 13:49:20.847158] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:26:33.543 00:26:33.543 real 0m0.662s 00:26:33.543 user 0m0.450s 00:26:33.543 sys 0m0.199s 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:26:33.543 ************************************ 00:26:33.543 END TEST bdev_hello_world 00:26:33.543 ************************************ 00:26:33.543 13:49:21 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:33.543 13:49:21 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:26:33.543 13:49:21 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:33.543 13:49:21 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:33.543 13:49:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:33.543 ************************************ 00:26:33.543 START TEST bdev_bounds 00:26:33.543 ************************************ 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=128007 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 128007' 00:26:33.543 Process bdevio pid: 128007 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 128007 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 128007 ']' 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:33.543 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:33.543 13:49:21 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:33.802 [2024-07-15 13:49:21.200780] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:26:33.802 [2024-07-15 13:49:21.200836] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid128007 ] 00:26:33.802 [2024-07-15 13:49:21.289879] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:33.802 [2024-07-15 13:49:21.378694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:33.802 [2024-07-15 13:49:21.378779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:33.802 [2024-07-15 13:49:21.378781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:34.062 [2024-07-15 13:49:21.547503] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:34.062 [2024-07-15 13:49:21.547561] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:34.062 [2024-07-15 13:49:21.547571] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:34.062 [2024-07-15 13:49:21.555521] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:34.062 [2024-07-15 13:49:21.555534] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:34.062 [2024-07-15 13:49:21.555542] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:34.062 [2024-07-15 13:49:21.563545] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:34.062 [2024-07-15 13:49:21.563557] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:34.062 [2024-07-15 13:49:21.563564] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:34.629 I/O targets: 00:26:34.629 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:26:34.629 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:26:34.629 00:26:34.629 00:26:34.629 CUnit - A unit testing framework for C - Version 2.1-3 00:26:34.629 http://cunit.sourceforge.net/ 00:26:34.629 00:26:34.629 00:26:34.629 Suite: bdevio tests on: crypto_ram3 00:26:34.629 Test: blockdev write read block ...passed 00:26:34.629 Test: blockdev write zeroes read block ...passed 00:26:34.629 Test: blockdev write zeroes read no split ...passed 00:26:34.629 Test: blockdev write zeroes read split ...passed 00:26:34.629 Test: blockdev write zeroes read split partial ...passed 00:26:34.629 Test: blockdev reset ...passed 00:26:34.629 Test: blockdev write read 8 blocks ...passed 00:26:34.629 Test: blockdev write read size > 128k ...passed 00:26:34.629 Test: blockdev write read invalid size ...passed 00:26:34.629 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:34.629 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:34.629 Test: blockdev write read max offset ...passed 00:26:34.629 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:34.629 Test: blockdev writev readv 8 blocks 
...passed 00:26:34.629 Test: blockdev writev readv 30 x 1block ...passed 00:26:34.629 Test: blockdev writev readv block ...passed 00:26:34.629 Test: blockdev writev readv size > 128k ...passed 00:26:34.629 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:34.629 Test: blockdev comparev and writev ...passed 00:26:34.629 Test: blockdev nvme passthru rw ...passed 00:26:34.629 Test: blockdev nvme passthru vendor specific ...passed 00:26:34.629 Test: blockdev nvme admin passthru ...passed 00:26:34.629 Test: blockdev copy ...passed 00:26:34.629 Suite: bdevio tests on: crypto_ram 00:26:34.629 Test: blockdev write read block ...passed 00:26:34.629 Test: blockdev write zeroes read block ...passed 00:26:34.629 Test: blockdev write zeroes read no split ...passed 00:26:34.629 Test: blockdev write zeroes read split ...passed 00:26:34.629 Test: blockdev write zeroes read split partial ...passed 00:26:34.629 Test: blockdev reset ...passed 00:26:34.629 Test: blockdev write read 8 blocks ...passed 00:26:34.629 Test: blockdev write read size > 128k ...passed 00:26:34.629 Test: blockdev write read invalid size ...passed 00:26:34.629 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:34.629 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:34.629 Test: blockdev write read max offset ...passed 00:26:34.629 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:34.629 Test: blockdev writev readv 8 blocks ...passed 00:26:34.629 Test: blockdev writev readv 30 x 1block ...passed 00:26:34.629 Test: blockdev writev readv block ...passed 00:26:34.629 Test: blockdev writev readv size > 128k ...passed 00:26:34.629 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:34.629 Test: blockdev comparev and writev ...passed 00:26:34.629 Test: blockdev nvme passthru rw ...passed 00:26:34.629 Test: blockdev nvme passthru vendor specific ...passed 00:26:34.629 Test: blockdev nvme admin passthru ...passed 00:26:34.629 Test: blockdev copy ...passed 00:26:34.629 00:26:34.629 Run Summary: Type Total Ran Passed Failed Inactive 00:26:34.629 suites 2 2 n/a 0 0 00:26:34.629 tests 46 46 46 0 0 00:26:34.629 asserts 260 260 260 0 n/a 00:26:34.629 00:26:34.629 Elapsed time = 0.086 seconds 00:26:34.629 0 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 128007 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 128007 ']' 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 128007 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 128007 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 128007' 00:26:34.629 killing process with pid 128007 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 128007 00:26:34.629 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # 
wait 128007 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:26:34.888 00:26:34.888 real 0m1.237s 00:26:34.888 user 0m3.168s 00:26:34.888 sys 0m0.316s 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:34.888 ************************************ 00:26:34.888 END TEST bdev_bounds 00:26:34.888 ************************************ 00:26:34.888 13:49:22 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:34.888 13:49:22 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:26:34.888 13:49:22 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:34.888 13:49:22 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:34.888 13:49:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:34.888 ************************************ 00:26:34.888 START TEST bdev_nbd 00:26:34.888 ************************************ 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=128208 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 128208 /var/tmp/spdk-nbd.sock 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 128208 ']' 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:26:34.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:34.888 13:49:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:35.146 [2024-07-15 13:49:22.534215] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:26:35.146 [2024-07-15 13:49:22.534272] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:35.146 [2024-07-15 13:49:22.626582] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:35.146 [2024-07-15 13:49:22.708718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:35.405 [2024-07-15 13:49:22.868684] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:35.405 [2024-07-15 13:49:22.868758] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:35.405 [2024-07-15 13:49:22.868770] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:35.405 [2024-07-15 13:49:22.876701] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:35.405 [2024-07-15 13:49:22.876713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:35.405 [2024-07-15 13:49:22.876721] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:35.405 [2024-07-15 13:49:22.884722] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:35.405 [2024-07-15 13:49:22.884733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:35.405 [2024-07-15 13:49:22.884740] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:35.971 13:49:23 
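With bdev_svc up and the crypto_ram/crypto_ram3 vbdevs registered, nbd_rpc_start_stop_verify (traced below) exports each bdev as an NBD block device over the /var/tmp/spdk-nbd.sock RPC socket, reads one 4096-byte block from it with dd as a sanity check, lists the mappings with nbd_get_disks, and then stops the devices again. The test itself is bash in nbd_common.sh; the following is only a condensed Python rendering of the same RPC sequence, assuming rpc.py is reachable at the path shown:

```python
import json
import subprocess

RPC = ["scripts/rpc.py", "-s", "/var/tmp/spdk-nbd.sock"]  # socket path from the trace below

def rpc(*args: str) -> str:
    """Run one SPDK RPC over the NBD-test socket and return its stdout."""
    return subprocess.check_output(RPC + list(args), text=True)

# Export each crypto bdev as a /dev/nbdX device, then sanity-read one block from it.
devices = {}
for bdev in ("crypto_ram", "crypto_ram3"):
    nbd = rpc("nbd_start_disk", bdev).strip()              # e.g. "/dev/nbd0"
    devices[bdev] = nbd
    subprocess.check_call(["dd", f"if={nbd}", "of=/tmp/nbdtest",
                           "bs=4096", "count=1", "iflag=direct"])

# nbd_get_disks returns the same JSON shown later in the log:
# [{"nbd_device": "/dev/nbd0", "bdev_name": "crypto_ram"}, ...]
for entry in json.loads(rpc("nbd_get_disks")):
    print(entry["nbd_device"], "->", entry["bdev_name"])

# Tear the NBD devices down again, mirroring nbd_stop_disks.
for nbd in devices.values():
    rpc("nbd_stop_disk", nbd)
```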
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:35.971 1+0 records in 00:26:35.971 1+0 records out 00:26:35.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208088 s, 19.7 MB/s 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:35.971 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:36.230 1+0 records in 00:26:36.230 1+0 records out 00:26:36.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258547 s, 15.8 MB/s 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:36.230 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:36.487 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:26:36.487 { 00:26:36.487 "nbd_device": "/dev/nbd0", 00:26:36.487 "bdev_name": "crypto_ram" 00:26:36.487 }, 00:26:36.487 { 00:26:36.487 "nbd_device": "/dev/nbd1", 00:26:36.487 "bdev_name": "crypto_ram3" 00:26:36.487 } 00:26:36.487 ]' 00:26:36.487 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:26:36.487 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:26:36.487 { 00:26:36.487 "nbd_device": "/dev/nbd0", 00:26:36.487 "bdev_name": "crypto_ram" 00:26:36.487 }, 00:26:36.487 { 00:26:36.487 "nbd_device": "/dev/nbd1", 00:26:36.487 "bdev_name": "crypto_ram3" 00:26:36.487 } 00:26:36.487 ]' 00:26:36.487 13:49:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.487 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:36.745 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# grep -c /dev/nbd 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:37.003 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:26:37.328 /dev/nbd0 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:37.328 1+0 records in 00:26:37.328 1+0 records out 00:26:37.328 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257787 s, 15.9 MB/s 00:26:37.328 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:37.329 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:26:37.586 /dev/nbd1 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:37.586 1+0 records in 00:26:37.586 1+0 records out 00:26:37.586 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188995 s, 21.7 MB/s 00:26:37.586 13:49:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:26:37.586 { 00:26:37.586 "nbd_device": "/dev/nbd0", 00:26:37.586 "bdev_name": "crypto_ram" 00:26:37.586 }, 00:26:37.586 { 00:26:37.586 "nbd_device": "/dev/nbd1", 00:26:37.586 "bdev_name": "crypto_ram3" 00:26:37.586 } 00:26:37.586 ]' 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:26:37.586 { 00:26:37.586 "nbd_device": "/dev/nbd0", 00:26:37.586 "bdev_name": "crypto_ram" 00:26:37.586 }, 00:26:37.586 { 00:26:37.586 "nbd_device": "/dev/nbd1", 00:26:37.586 "bdev_name": "crypto_ram3" 00:26:37.586 } 00:26:37.586 ]' 00:26:37.586 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:26:37.844 /dev/nbd1' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:26:37.844 /dev/nbd1' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:26:37.844 256+0 records in 00:26:37.844 256+0 records out 00:26:37.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113515 s, 92.4 MB/s 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:26:37.844 256+0 records in 00:26:37.844 256+0 records out 00:26:37.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0173238 s, 60.5 MB/s 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:26:37.844 256+0 records in 00:26:37.844 256+0 records out 00:26:37.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0261822 s, 40.0 MB/s 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:37.844 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:38.101 13:49:25 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:38.101 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:38.102 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:38.102 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:38.102 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:26:38.379 13:49:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:26:38.637 malloc_lvol_verify 00:26:38.637 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:26:38.896 
e88852f2-c739-44f4-ba2c-bf06ee8cc8b9 00:26:38.896 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:26:38.896 7760605f-562c-4dce-a51e-24fe615c4977 00:26:38.896 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:26:39.154 /dev/nbd0 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:26:39.154 mke2fs 1.46.5 (30-Dec-2021) 00:26:39.154 Discarding device blocks: 0/4096 done 00:26:39.154 Creating filesystem with 4096 1k blocks and 1024 inodes 00:26:39.154 00:26:39.154 Allocating group tables: 0/1 done 00:26:39.154 Writing inode tables: 0/1 done 00:26:39.154 Creating journal (1024 blocks): done 00:26:39.154 Writing superblocks and filesystem accounting information: 0/1 done 00:26:39.154 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:39.154 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 128208 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 128208 ']' 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 128208 00:26:39.413 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 128208 
00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 128208' 00:26:39.414 killing process with pid 128208 00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 128208 00:26:39.414 13:49:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 128208 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:26:39.673 00:26:39.673 real 0m4.633s 00:26:39.673 user 0m6.362s 00:26:39.673 sys 0m1.876s 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:39.673 ************************************ 00:26:39.673 END TEST bdev_nbd 00:26:39.673 ************************************ 00:26:39.673 13:49:27 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:39.673 13:49:27 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:26:39.673 13:49:27 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:26:39.673 13:49:27 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:26:39.673 13:49:27 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:26:39.673 13:49:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:39.673 13:49:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:39.673 13:49:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:39.673 ************************************ 00:26:39.673 START TEST bdev_fio 00:26:39.673 ************************************ 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:39.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:39.673 ************************************ 00:26:39.673 START TEST bdev_fio_rw_verify 00:26:39.673 ************************************ 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:26:39.673 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:39.674 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:26:39.674 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:26:39.674 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:39.938 13:49:27 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:40.198 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:40.198 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:40.198 fio-3.35 00:26:40.198 Starting 2 threads 00:26:52.399 00:26:52.399 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=129157: Mon Jul 15 13:49:38 2024 00:26:52.399 read: IOPS=31.1k, BW=121MiB/s (127MB/s)(1215MiB/10000msec) 00:26:52.399 slat (nsec): min=8887, max=68667, avg=14241.61, stdev=3066.28 00:26:52.399 clat (usec): min=5, max=283, avg=103.33, stdev=41.81 00:26:52.399 lat (usec): min=18, max=308, avg=117.57, stdev=42.94 00:26:52.399 clat percentiles (usec): 00:26:52.399 | 50.000th=[ 101], 99.000th=[ 202], 99.900th=[ 221], 99.990th=[ 239], 00:26:52.399 | 99.999th=[ 265] 00:26:52.399 write: IOPS=37.4k, BW=146MiB/s (153MB/s)(1386MiB/9478msec); 0 zone resets 00:26:52.399 slat (usec): min=9, max=462, avg=23.72, stdev= 3.76 00:26:52.399 clat (usec): min=16, max=863, avg=138.23, stdev=63.49 00:26:52.399 lat (usec): min=33, max=982, avg=161.95, stdev=64.82 00:26:52.399 clat percentiles (usec): 00:26:52.399 | 50.000th=[ 135], 99.000th=[ 277], 99.900th=[ 306], 99.990th=[ 644], 00:26:52.399 | 99.999th=[ 857] 00:26:52.399 bw ( KiB/s): min=135704, max=147784, per=94.67%, avg=141785.79, stdev=2099.08, samples=38 00:26:52.399 iops : min=33926, max=36946, avg=35446.42, stdev=524.78, samples=38 00:26:52.399 lat (usec) : 10=0.01%, 20=0.01%, 50=8.77%, 100=31.24%, 250=57.37% 00:26:52.400 lat (usec) : 500=2.59%, 750=0.01%, 1000=0.01% 00:26:52.400 cpu : usr=99.67%, sys=0.01%, ctx=39, majf=0, minf=465 00:26:52.400 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:52.400 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:52.400 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:52.400 issued rwts: total=310998,354864,0,0 short=0,0,0,0 dropped=0,0,0,0 00:26:52.400 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:52.400 00:26:52.400 Run status group 0 (all jobs): 00:26:52.400 READ: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1215MiB (1274MB), run=10000-10000msec 00:26:52.400 WRITE: bw=146MiB/s (153MB/s), 146MiB/s-146MiB/s (153MB/s-153MB/s), io=1386MiB (1454MB), run=9478-9478msec 00:26:52.400 00:26:52.400 real 0m10.978s 00:26:52.400 user 0m23.419s 00:26:52.400 sys 0m0.285s 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:26:52.400 ************************************ 00:26:52.400 END TEST bdev_fio_rw_verify 00:26:52.400 ************************************ 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "a4a0a4d0-b67b-5c40-950c-3131d8e4ad98"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a4a0a4d0-b67b-5c40-950c-3131d8e4ad98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "2293f4b4-8ccf-5e69-8e1a-c865f6b238d7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "2293f4b4-8ccf-5e69-8e1a-c865f6b238d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:26:52.400 crypto_ram3 ]] 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "a4a0a4d0-b67b-5c40-950c-3131d8e4ad98"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a4a0a4d0-b67b-5c40-950c-3131d8e4ad98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "2293f4b4-8ccf-5e69-8e1a-c865f6b238d7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "2293f4b4-8ccf-5e69-8e1a-c865f6b238d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:52.400 13:49:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:52.400 ************************************ 00:26:52.400 START TEST bdev_fio_trim 00:26:52.400 ************************************ 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:52.401 13:49:38 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:52.401 13:49:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:52.401 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:52.401 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:52.401 fio-3.35 00:26:52.401 Starting 2 threads 00:27:02.374 00:27:02.374 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=130724: Mon Jul 15 13:49:49 2024 00:27:02.374 write: IOPS=55.6k, BW=217MiB/s (228MB/s)(2171MiB/10001msec); 0 zone resets 00:27:02.374 slat (usec): min=9, max=1475, avg=15.84, stdev= 3.99 00:27:02.374 clat (usec): min=24, max=1747, avg=117.85, stdev=65.57 00:27:02.374 lat (usec): min=33, max=1767, avg=133.69, stdev=68.14 00:27:02.374 clat percentiles (usec): 00:27:02.374 | 50.000th=[ 94], 99.000th=[ 249], 99.900th=[ 277], 99.990th=[ 449], 00:27:02.374 | 99.999th=[ 578] 00:27:02.374 bw ( KiB/s): min=217456, max=227144, per=100.00%, avg=222380.63, stdev=958.54, samples=38 00:27:02.374 iops : min=54364, max=56786, avg=55595.16, stdev=239.64, samples=38 00:27:02.374 trim: IOPS=55.6k, BW=217MiB/s (228MB/s)(2171MiB/10001msec); 0 zone resets 00:27:02.374 slat 
(nsec): min=4045, max=50320, avg=7339.02, stdev=1812.41 00:27:02.374 clat (usec): min=19, max=1595, avg=78.50, stdev=24.37 00:27:02.374 lat (usec): min=25, max=1603, avg=85.84, stdev=24.56 00:27:02.374 clat percentiles (usec): 00:27:02.374 | 50.000th=[ 79], 99.000th=[ 133], 99.900th=[ 147], 99.990th=[ 251], 00:27:02.374 | 99.999th=[ 510] 00:27:02.374 bw ( KiB/s): min=217448, max=227144, per=100.00%, avg=222381.89, stdev=959.10, samples=38 00:27:02.374 iops : min=54362, max=56786, avg=55595.47, stdev=239.78, samples=38 00:27:02.374 lat (usec) : 20=0.01%, 50=15.34%, 100=50.48%, 250=33.69%, 500=0.49% 00:27:02.374 lat (usec) : 750=0.01% 00:27:02.374 lat (msec) : 2=0.01% 00:27:02.374 cpu : usr=99.69%, sys=0.01%, ctx=29, majf=0, minf=254 00:27:02.374 IO depths : 1=7.5%, 2=17.5%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:02.374 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:02.374 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:02.374 issued rwts: total=0,555841,555842,0 short=0,0,0,0 dropped=0,0,0,0 00:27:02.374 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:02.374 00:27:02.374 Run status group 0 (all jobs): 00:27:02.374 WRITE: bw=217MiB/s (228MB/s), 217MiB/s-217MiB/s (228MB/s-228MB/s), io=2171MiB (2277MB), run=10001-10001msec 00:27:02.374 TRIM: bw=217MiB/s (228MB/s), 217MiB/s-217MiB/s (228MB/s-228MB/s), io=2171MiB (2277MB), run=10001-10001msec 00:27:02.374 00:27:02.374 real 0m11.033s 00:27:02.374 user 0m23.437s 00:27:02.374 sys 0m0.419s 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:27:02.374 ************************************ 00:27:02.374 END TEST bdev_fio_trim 00:27:02.374 ************************************ 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:27:02.374 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:27:02.374 00:27:02.374 real 0m22.356s 00:27:02.374 user 0m47.033s 00:27:02.374 sys 0m0.868s 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:02.374 ************************************ 00:27:02.374 END TEST bdev_fio 00:27:02.374 ************************************ 00:27:02.374 13:49:49 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:02.374 13:49:49 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:02.374 13:49:49 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:02.374 13:49:49 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:27:02.374 13:49:49 blockdev_crypto_sw -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:27:02.374 13:49:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:02.374 ************************************ 00:27:02.374 START TEST bdev_verify 00:27:02.374 ************************************ 00:27:02.374 13:49:49 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:02.374 [2024-07-15 13:49:49.690563] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:27:02.374 [2024-07-15 13:49:49.690618] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid131985 ] 00:27:02.374 [2024-07-15 13:49:49.777080] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:02.374 [2024-07-15 13:49:49.866706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:02.374 [2024-07-15 13:49:49.866708] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:02.633 [2024-07-15 13:49:50.034136] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:02.633 [2024-07-15 13:49:50.034267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:02.633 [2024-07-15 13:49:50.034302] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.633 [2024-07-15 13:49:50.042096] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:02.633 [2024-07-15 13:49:50.042114] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:02.633 [2024-07-15 13:49:50.042124] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.633 [2024-07-15 13:49:50.050118] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:02.633 [2024-07-15 13:49:50.050132] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:02.633 [2024-07-15 13:49:50.050140] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.633 Running I/O for 5 seconds... 
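The verify pass launched above is a plain bdevperf run; stripped of the xtrace prefixes it amounts to the sketch below (the binary and JSON paths are copied from the trace, the small wrapper variable around them is only an illustrative assumption). Its latency summary follows right after.

    # Minimal sketch of the traced bdevperf verify invocation (wrapper variable assumed).
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/build/examples/bdevperf" \
        --json "$SPDK/test/bdev/bdev.json" \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3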
00:27:07.902 00:27:07.902 Latency(us) 00:27:07.902 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:07.902 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:07.902 Verification LBA range: start 0x0 length 0x800 00:27:07.902 crypto_ram : 5.00 8339.24 32.58 0.00 0.00 15294.12 1154.00 18805.98 00:27:07.902 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:07.902 Verification LBA range: start 0x800 length 0x800 00:27:07.902 crypto_ram : 5.02 8344.22 32.59 0.00 0.00 15286.64 1260.86 18805.98 00:27:07.902 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:07.902 Verification LBA range: start 0x0 length 0x800 00:27:07.902 crypto_ram3 : 5.02 4185.23 16.35 0.00 0.00 30461.78 1666.89 21313.45 00:27:07.902 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:07.902 Verification LBA range: start 0x800 length 0x800 00:27:07.902 crypto_ram3 : 5.02 4180.89 16.33 0.00 0.00 30493.49 1168.25 21655.37 00:27:07.902 =================================================================================================================== 00:27:07.902 Total : 25049.58 97.85 0.00 0.00 20368.28 1154.00 21655.37 00:27:07.902 00:27:07.902 real 0m5.716s 00:27:07.902 user 0m10.838s 00:27:07.902 sys 0m0.214s 00:27:07.902 13:49:55 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:07.902 13:49:55 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:27:07.902 ************************************ 00:27:07.902 END TEST bdev_verify 00:27:07.902 ************************************ 00:27:07.902 13:49:55 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:07.902 13:49:55 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:07.902 13:49:55 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:27:07.902 13:49:55 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:07.902 13:49:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:07.902 ************************************ 00:27:07.902 START TEST bdev_verify_big_io 00:27:07.902 ************************************ 00:27:07.902 13:49:55 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:07.902 [2024-07-15 13:49:55.497507] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:27:07.902 [2024-07-15 13:49:55.497563] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid132708 ] 00:27:08.161 [2024-07-15 13:49:55.587774] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:08.161 [2024-07-15 13:49:55.677011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:08.161 [2024-07-15 13:49:55.677014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:08.419 [2024-07-15 13:49:55.847943] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:08.419 [2024-07-15 13:49:55.848007] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:08.419 [2024-07-15 13:49:55.848018] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:08.419 [2024-07-15 13:49:55.855961] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:08.419 [2024-07-15 13:49:55.855974] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:08.419 [2024-07-15 13:49:55.855982] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:08.419 [2024-07-15 13:49:55.863983] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:08.419 [2024-07-15 13:49:55.863999] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:08.419 [2024-07-15 13:49:55.864007] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:08.419 Running I/O for 5 seconds... 
00:27:13.695 00:27:13.695 Latency(us) 00:27:13.695 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:13.695 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:13.695 Verification LBA range: start 0x0 length 0x80 00:27:13.695 crypto_ram : 5.05 735.73 45.98 0.00 0.00 171104.62 5242.88 225215.89 00:27:13.695 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:13.695 Verification LBA range: start 0x80 length 0x80 00:27:13.695 crypto_ram : 5.05 734.64 45.92 0.00 0.00 171305.47 5014.93 224304.08 00:27:13.695 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:13.695 Verification LBA range: start 0x0 length 0x80 00:27:13.695 crypto_ram3 : 5.17 395.75 24.73 0.00 0.00 311020.86 4786.98 237069.36 00:27:13.695 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:13.695 Verification LBA range: start 0x80 length 0x80 00:27:13.695 crypto_ram3 : 5.18 395.26 24.70 0.00 0.00 311443.99 4188.61 238892.97 00:27:13.695 =================================================================================================================== 00:27:13.695 Total : 2261.38 141.34 0.00 0.00 220992.56 4188.61 238892.97 00:27:13.953 00:27:13.953 real 0m5.898s 00:27:13.953 user 0m11.168s 00:27:13.953 sys 0m0.234s 00:27:13.953 13:50:01 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:13.953 13:50:01 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:27:13.953 ************************************ 00:27:13.953 END TEST bdev_verify_big_io 00:27:13.953 ************************************ 00:27:13.953 13:50:01 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:13.953 13:50:01 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:13.953 13:50:01 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:27:13.953 13:50:01 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:13.953 13:50:01 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:13.953 ************************************ 00:27:13.953 START TEST bdev_write_zeroes 00:27:13.953 ************************************ 00:27:13.953 13:50:01 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:13.953 [2024-07-15 13:50:01.463550] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:27:13.953 [2024-07-15 13:50:01.463597] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133585 ] 00:27:13.953 [2024-07-15 13:50:01.548706] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:14.211 [2024-07-15 13:50:01.634036] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:14.211 [2024-07-15 13:50:01.801548] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:14.211 [2024-07-15 13:50:01.801602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:14.211 [2024-07-15 13:50:01.801612] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:14.211 [2024-07-15 13:50:01.809564] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:14.211 [2024-07-15 13:50:01.809576] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:14.211 [2024-07-15 13:50:01.809583] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:14.211 [2024-07-15 13:50:01.817584] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:14.211 [2024-07-15 13:50:01.817595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:14.211 [2024-07-15 13:50:01.817602] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:14.470 Running I/O for 1 seconds... 00:27:15.408 00:27:15.408 Latency(us) 00:27:15.408 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:15.408 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:15.408 crypto_ram : 1.00 41857.30 163.51 0.00 0.00 3052.04 826.32 4559.03 00:27:15.408 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:15.408 crypto_ram3 : 1.01 20974.50 81.93 0.00 0.00 6072.52 1061.40 6753.06 00:27:15.408 =================================================================================================================== 00:27:15.408 Total : 62831.80 245.44 0.00 0.00 4062.95 826.32 6753.06 00:27:15.668 00:27:15.668 real 0m1.669s 00:27:15.668 user 0m1.452s 00:27:15.668 sys 0m0.196s 00:27:15.668 13:50:03 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:15.668 13:50:03 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:27:15.668 ************************************ 00:27:15.668 END TEST bdev_write_zeroes 00:27:15.668 ************************************ 00:27:15.668 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:15.668 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:15.668 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:27:15.668 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:15.668 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:15.668 
************************************ 00:27:15.668 START TEST bdev_json_nonenclosed 00:27:15.668 ************************************ 00:27:15.668 13:50:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:15.668 [2024-07-15 13:50:03.205540] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:27:15.668 [2024-07-15 13:50:03.205580] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133780 ] 00:27:15.940 [2024-07-15 13:50:03.291454] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:15.940 [2024-07-15 13:50:03.375441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:15.940 [2024-07-15 13:50:03.375501] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:27:15.940 [2024-07-15 13:50:03.375515] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:15.940 [2024-07-15 13:50:03.375524] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:15.940 00:27:15.940 real 0m0.311s 00:27:15.940 user 0m0.204s 00:27:15.940 sys 0m0.105s 00:27:15.940 13:50:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:27:15.940 13:50:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:15.940 13:50:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:27:15.940 ************************************ 00:27:15.940 END TEST bdev_json_nonenclosed 00:27:15.940 ************************************ 00:27:15.940 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:27:15.940 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:27:15.940 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:15.940 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:27:15.940 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:15.940 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:15.940 ************************************ 00:27:15.940 START TEST bdev_json_nonarray 00:27:15.940 ************************************ 00:27:15.940 13:50:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:16.354 [2024-07-15 13:50:03.583981] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:27:16.354 [2024-07-15 13:50:03.584041] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133812 ] 00:27:16.354 [2024-07-15 13:50:03.667181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:16.354 [2024-07-15 13:50:03.747462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:16.354 [2024-07-15 13:50:03.747524] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:27:16.354 [2024-07-15 13:50:03.747538] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:16.354 [2024-07-15 13:50:03.747546] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:16.354 00:27:16.354 real 0m0.298s 00:27:16.354 user 0m0.185s 00:27:16.354 sys 0m0.111s 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:27:16.354 ************************************ 00:27:16.354 END TEST bdev_json_nonarray 00:27:16.354 ************************************ 00:27:16.354 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:27:16.354 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:27:16.354 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:27:16.354 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:27:16.354 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:27:16.354 13:50:03 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:27:16.354 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:27:16.354 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:16.354 13:50:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:16.354 ************************************ 00:27:16.354 START TEST bdev_crypto_enomem 00:27:16.354 ************************************ 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=133872 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # 
waitforlisten 133872 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 133872 ']' 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:16.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:16.354 13:50:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:16.354 [2024-07-15 13:50:03.953156] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:27:16.354 [2024-07-15 13:50:03.953208] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133872 ] 00:27:16.613 [2024-07-15 13:50:04.042896] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:16.613 [2024-07-15 13:50:04.124788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:17.181 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:17.181 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:17.182 true 00:27:17.182 base0 00:27:17.182 true 00:27:17.182 [2024-07-15 13:50:04.785264] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:17.182 crypt0 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:17.182 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:17.440 13:50:04 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:17.440 [ 00:27:17.440 { 00:27:17.440 "name": "crypt0", 00:27:17.440 "aliases": [ 00:27:17.440 "3e2eac42-1f94-5e40-aad7-9b53ca1c56a8" 00:27:17.440 ], 00:27:17.440 "product_name": "crypto", 00:27:17.440 "block_size": 512, 00:27:17.440 "num_blocks": 2097152, 00:27:17.440 "uuid": "3e2eac42-1f94-5e40-aad7-9b53ca1c56a8", 00:27:17.440 "assigned_rate_limits": { 00:27:17.440 "rw_ios_per_sec": 0, 00:27:17.440 "rw_mbytes_per_sec": 0, 00:27:17.440 "r_mbytes_per_sec": 0, 00:27:17.440 "w_mbytes_per_sec": 0 00:27:17.440 }, 00:27:17.440 "claimed": false, 00:27:17.440 "zoned": false, 00:27:17.440 "supported_io_types": { 00:27:17.440 "read": true, 00:27:17.440 "write": true, 00:27:17.440 "unmap": false, 00:27:17.440 "flush": false, 00:27:17.440 "reset": true, 00:27:17.440 "nvme_admin": false, 00:27:17.440 "nvme_io": false, 00:27:17.440 "nvme_io_md": false, 00:27:17.440 "write_zeroes": true, 00:27:17.440 "zcopy": false, 00:27:17.440 "get_zone_info": false, 00:27:17.440 "zone_management": false, 00:27:17.440 "zone_append": false, 00:27:17.440 "compare": false, 00:27:17.440 "compare_and_write": false, 00:27:17.440 "abort": false, 00:27:17.440 "seek_hole": false, 00:27:17.440 "seek_data": false, 00:27:17.440 "copy": false, 00:27:17.440 "nvme_iov_md": false 00:27:17.440 }, 00:27:17.440 "memory_domains": [ 00:27:17.440 { 00:27:17.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.440 "dma_device_type": 2 00:27:17.440 } 00:27:17.440 ], 00:27:17.440 "driver_specific": { 00:27:17.440 "crypto": { 00:27:17.440 "base_bdev_name": "EE_base0", 00:27:17.440 "name": "crypt0", 00:27:17.440 "key_name": "test_dek_sw" 00:27:17.440 } 00:27:17.440 } 00:27:17.440 } 00:27:17.440 ] 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=134013 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:27:17.440 13:50:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:17.440 Running I/O for 5 seconds... 
00:27:18.409 13:50:05 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:27:18.409 13:50:05 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:18.409 13:50:05 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:18.409 13:50:05 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:18.409 13:50:05 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 134013 00:27:22.594 00:27:22.594 Latency(us) 00:27:22.594 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:22.594 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:27:22.594 crypt0 : 5.00 57200.47 223.44 0.00 0.00 557.26 260.01 815.64 00:27:22.594 =================================================================================================================== 00:27:22.594 Total : 57200.47 223.44 0.00 0.00 557.26 260.01 815.64 00:27:22.594 0 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 133872 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 133872 ']' 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 133872 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 133872 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 133872' 00:27:22.594 killing process with pid 133872 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 133872 00:27:22.594 Received shutdown signal, test time was about 5.000000 seconds 00:27:22.594 00:27:22.594 Latency(us) 00:27:22.594 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:22.594 =================================================================================================================== 00:27:22.594 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:22.594 13:50:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 133872 00:27:22.594 13:50:10 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:27:22.594 00:27:22.594 real 0m6.254s 00:27:22.594 user 0m6.424s 00:27:22.594 sys 0m0.306s 00:27:22.594 13:50:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- 
# xtrace_disable 00:27:22.594 13:50:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:22.594 ************************************ 00:27:22.594 END TEST bdev_crypto_enomem 00:27:22.594 ************************************ 00:27:22.594 13:50:10 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:27:22.594 13:50:10 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:27:22.594 00:27:22.594 real 0m51.739s 00:27:22.594 user 1m29.503s 00:27:22.594 sys 0m5.578s 00:27:22.594 13:50:10 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:22.594 13:50:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:22.594 ************************************ 00:27:22.594 END TEST blockdev_crypto_sw 00:27:22.594 ************************************ 00:27:22.852 13:50:10 -- common/autotest_common.sh@1142 -- # return 0 00:27:22.852 13:50:10 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:27:22.852 13:50:10 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:22.852 13:50:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:22.852 13:50:10 -- common/autotest_common.sh@10 -- # set +x 00:27:22.852 ************************************ 00:27:22.852 START TEST blockdev_crypto_qat 00:27:22.852 ************************************ 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:27:22.852 * Looking for test storage... 
00:27:22.852 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=134771 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:27:22.852 13:50:10 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 134771 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 134771 ']' 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:27:22.852 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:22.852 13:50:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:22.852 [2024-07-15 13:50:10.441973] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:27:22.853 [2024-07-15 13:50:10.442065] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134771 ] 00:27:23.111 [2024-07-15 13:50:10.529984] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:23.111 [2024-07-15 13:50:10.611175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:23.677 13:50:11 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:23.677 13:50:11 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:27:23.677 13:50:11 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:27:23.677 13:50:11 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:27:23.677 13:50:11 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:27:23.677 13:50:11 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:23.677 13:50:11 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:23.677 [2024-07-15 13:50:11.253132] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:23.677 [2024-07-15 13:50:11.261160] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:23.677 [2024-07-15 13:50:11.269175] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:23.936 [2024-07-15 13:50:11.343270] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:26.467 true 00:27:26.467 true 00:27:26.467 true 00:27:26.467 true 00:27:26.467 Malloc0 00:27:26.467 Malloc1 00:27:26.467 Malloc2 00:27:26.467 Malloc3 00:27:26.467 [2024-07-15 13:50:13.662805] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:26.467 crypto_ram 00:27:26.467 [2024-07-15 13:50:13.670823] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:26.467 crypto_ram1 00:27:26.467 [2024-07-15 13:50:13.678842] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:26.467 crypto_ram2 00:27:26.467 [2024-07-15 13:50:13.686862] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:26.467 crypto_ram3 00:27:26.467 [ 00:27:26.467 { 00:27:26.467 "name": "Malloc1", 00:27:26.467 "aliases": [ 00:27:26.467 "65860ab4-10a1-4fc4-b1e3-1d311cfed5d7" 00:27:26.467 ], 00:27:26.467 "product_name": "Malloc disk", 00:27:26.467 "block_size": 512, 00:27:26.467 "num_blocks": 65536, 00:27:26.467 "uuid": "65860ab4-10a1-4fc4-b1e3-1d311cfed5d7", 00:27:26.467 "assigned_rate_limits": { 00:27:26.467 "rw_ios_per_sec": 0, 00:27:26.467 "rw_mbytes_per_sec": 0, 00:27:26.467 "r_mbytes_per_sec": 0, 00:27:26.467 "w_mbytes_per_sec": 0 00:27:26.467 }, 00:27:26.467 "claimed": true, 00:27:26.467 "claim_type": "exclusive_write", 00:27:26.467 "zoned": false, 00:27:26.467 "supported_io_types": { 
00:27:26.467 "read": true, 00:27:26.467 "write": true, 00:27:26.467 "unmap": true, 00:27:26.467 "flush": true, 00:27:26.467 "reset": true, 00:27:26.467 "nvme_admin": false, 00:27:26.467 "nvme_io": false, 00:27:26.467 "nvme_io_md": false, 00:27:26.467 "write_zeroes": true, 00:27:26.467 "zcopy": true, 00:27:26.467 "get_zone_info": false, 00:27:26.467 "zone_management": false, 00:27:26.467 "zone_append": false, 00:27:26.467 "compare": false, 00:27:26.467 "compare_and_write": false, 00:27:26.467 "abort": true, 00:27:26.467 "seek_hole": false, 00:27:26.467 "seek_data": false, 00:27:26.467 "copy": true, 00:27:26.467 "nvme_iov_md": false 00:27:26.467 }, 00:27:26.467 "memory_domains": [ 00:27:26.467 { 00:27:26.467 "dma_device_id": "system", 00:27:26.467 "dma_device_type": 1 00:27:26.467 }, 00:27:26.467 { 00:27:26.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:26.467 "dma_device_type": 2 00:27:26.467 } 00:27:26.467 ], 00:27:26.467 "driver_specific": {} 00:27:26.467 } 00:27:26.467 ] 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:26.467 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:26.467 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:27:26.468 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "68927ac2-f1f4-5d2a-9617-aa1f31af3f7b"' ' ],' ' "product_name": "crypto",' 
' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "68927ac2-f1f4-5d2a-9617-aa1f31af3f7b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "712767db-6b07-5c00-8a57-f218d83f7433"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "712767db-6b07-5c00-8a57-f218d83f7433",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5d57016b-87d2-5b1a-a099-4e6fd8be10b0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5d57016b-87d2-5b1a-a099-4e6fd8be10b0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' 
"driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b8acd58a-15e3-5ddc-bca7-75228a151249"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b8acd58a-15e3-5ddc-bca7-75228a151249",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:26.468 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:27:26.468 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:27:26.468 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:27:26.468 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:27:26.468 13:50:13 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 134771 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 134771 ']' 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 134771 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 134771 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 134771' 00:27:26.468 killing process with pid 134771 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 134771 00:27:26.468 13:50:13 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 134771 00:27:27.034 13:50:14 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:27.034 13:50:14 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:27.034 13:50:14 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:27.034 13:50:14 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:27.034 13:50:14 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:27:27.034 ************************************ 00:27:27.034 START TEST bdev_hello_world 00:27:27.034 ************************************ 00:27:27.034 13:50:14 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:27.034 [2024-07-15 13:50:14.572224] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:27:27.034 [2024-07-15 13:50:14.572266] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid135316 ] 00:27:27.293 [2024-07-15 13:50:14.657176] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:27.293 [2024-07-15 13:50:14.741065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:27.293 [2024-07-15 13:50:14.761943] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:27.293 [2024-07-15 13:50:14.769966] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:27.293 [2024-07-15 13:50:14.777982] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:27.293 [2024-07-15 13:50:14.878743] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:29.822 [2024-07-15 13:50:17.063759] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:29.822 [2024-07-15 13:50:17.063815] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:29.822 [2024-07-15 13:50:17.063825] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:29.822 [2024-07-15 13:50:17.071780] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:29.822 [2024-07-15 13:50:17.071793] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:29.822 [2024-07-15 13:50:17.071801] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:29.822 [2024-07-15 13:50:17.079798] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:29.822 [2024-07-15 13:50:17.079809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:29.822 [2024-07-15 13:50:17.079816] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:29.823 [2024-07-15 13:50:17.087817] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:29.823 [2024-07-15 13:50:17.087829] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:29.823 [2024-07-15 13:50:17.087840] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:29.823 [2024-07-15 13:50:17.155835] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:29.823 [2024-07-15 13:50:17.155870] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:29.823 [2024-07-15 13:50:17.155882] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:27:29.823 [2024-07-15 13:50:17.156796] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:29.823 [2024-07-15 13:50:17.156855] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:29.823 [2024-07-15 13:50:17.156867] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:29.823 [2024-07-15 13:50:17.156899] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:27:29.823 00:27:29.823 [2024-07-15 13:50:17.156913] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:30.081 00:27:30.081 real 0m2.964s 00:27:30.081 user 0m2.601s 00:27:30.081 sys 0m0.332s 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:30.081 ************************************ 00:27:30.081 END TEST bdev_hello_world 00:27:30.081 ************************************ 00:27:30.081 13:50:17 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:30.081 13:50:17 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:27:30.081 13:50:17 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:30.081 13:50:17 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:30.081 13:50:17 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:30.081 ************************************ 00:27:30.081 START TEST bdev_bounds 00:27:30.081 ************************************ 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=135689 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 135689' 00:27:30.081 Process bdevio pid: 135689 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 135689 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 135689 ']' 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:30.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:30.081 13:50:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:30.081 [2024-07-15 13:50:17.610896] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:27:30.081 [2024-07-15 13:50:17.610947] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid135689 ] 00:27:30.081 [2024-07-15 13:50:17.697306] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:30.339 [2024-07-15 13:50:17.786856] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:30.339 [2024-07-15 13:50:17.786947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:30.339 [2024-07-15 13:50:17.786949] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:30.339 [2024-07-15 13:50:17.808095] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:30.339 [2024-07-15 13:50:17.816122] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:30.339 [2024-07-15 13:50:17.824142] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:30.340 [2024-07-15 13:50:17.925933] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:32.883 [2024-07-15 13:50:20.102368] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:32.883 [2024-07-15 13:50:20.102435] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:32.883 [2024-07-15 13:50:20.102445] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:32.883 [2024-07-15 13:50:20.110384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:32.883 [2024-07-15 13:50:20.110397] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:32.883 [2024-07-15 13:50:20.110404] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:32.883 [2024-07-15 13:50:20.118406] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:32.883 [2024-07-15 13:50:20.118418] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:32.883 [2024-07-15 13:50:20.118425] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:32.883 [2024-07-15 13:50:20.126428] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:32.883 [2024-07-15 13:50:20.126439] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:32.883 [2024-07-15 13:50:20.126447] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:32.883 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:32.883 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:27:32.883 13:50:20 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:32.883 I/O targets: 00:27:32.883 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:27:32.883 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:27:32.883 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:27:32.883 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:27:32.883 
00:27:32.883 00:27:32.883 CUnit - A unit testing framework for C - Version 2.1-3 00:27:32.883 http://cunit.sourceforge.net/ 00:27:32.883 00:27:32.883 00:27:32.883 Suite: bdevio tests on: crypto_ram3 00:27:32.883 Test: blockdev write read block ...passed 00:27:32.883 Test: blockdev write zeroes read block ...passed 00:27:32.883 Test: blockdev write zeroes read no split ...passed 00:27:32.883 Test: blockdev write zeroes read split ...passed 00:27:32.883 Test: blockdev write zeroes read split partial ...passed 00:27:32.883 Test: blockdev reset ...passed 00:27:32.883 Test: blockdev write read 8 blocks ...passed 00:27:32.883 Test: blockdev write read size > 128k ...passed 00:27:32.883 Test: blockdev write read invalid size ...passed 00:27:32.883 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:32.883 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:32.883 Test: blockdev write read max offset ...passed 00:27:32.883 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:32.883 Test: blockdev writev readv 8 blocks ...passed 00:27:32.883 Test: blockdev writev readv 30 x 1block ...passed 00:27:32.883 Test: blockdev writev readv block ...passed 00:27:32.883 Test: blockdev writev readv size > 128k ...passed 00:27:32.883 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:32.883 Test: blockdev comparev and writev ...passed 00:27:32.883 Test: blockdev nvme passthru rw ...passed 00:27:32.883 Test: blockdev nvme passthru vendor specific ...passed 00:27:32.883 Test: blockdev nvme admin passthru ...passed 00:27:32.883 Test: blockdev copy ...passed 00:27:32.883 Suite: bdevio tests on: crypto_ram2 00:27:32.883 Test: blockdev write read block ...passed 00:27:32.883 Test: blockdev write zeroes read block ...passed 00:27:32.883 Test: blockdev write zeroes read no split ...passed 00:27:32.883 Test: blockdev write zeroes read split ...passed 00:27:32.883 Test: blockdev write zeroes read split partial ...passed 00:27:32.883 Test: blockdev reset ...passed 00:27:32.883 Test: blockdev write read 8 blocks ...passed 00:27:32.883 Test: blockdev write read size > 128k ...passed 00:27:32.883 Test: blockdev write read invalid size ...passed 00:27:32.883 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:32.883 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:32.883 Test: blockdev write read max offset ...passed 00:27:32.883 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:32.883 Test: blockdev writev readv 8 blocks ...passed 00:27:32.883 Test: blockdev writev readv 30 x 1block ...passed 00:27:32.883 Test: blockdev writev readv block ...passed 00:27:32.883 Test: blockdev writev readv size > 128k ...passed 00:27:32.883 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:32.883 Test: blockdev comparev and writev ...passed 00:27:32.883 Test: blockdev nvme passthru rw ...passed 00:27:32.883 Test: blockdev nvme passthru vendor specific ...passed 00:27:32.883 Test: blockdev nvme admin passthru ...passed 00:27:32.883 Test: blockdev copy ...passed 00:27:32.883 Suite: bdevio tests on: crypto_ram1 00:27:32.883 Test: blockdev write read block ...passed 00:27:32.883 Test: blockdev write zeroes read block ...passed 00:27:32.883 Test: blockdev write zeroes read no split ...passed 00:27:32.883 Test: blockdev write zeroes read split ...passed 00:27:32.883 Test: blockdev write zeroes read split partial ...passed 00:27:32.883 Test: blockdev reset 
...passed 00:27:32.883 Test: blockdev write read 8 blocks ...passed 00:27:32.883 Test: blockdev write read size > 128k ...passed 00:27:32.883 Test: blockdev write read invalid size ...passed 00:27:32.883 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:32.883 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:32.883 Test: blockdev write read max offset ...passed 00:27:32.883 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:32.883 Test: blockdev writev readv 8 blocks ...passed 00:27:32.883 Test: blockdev writev readv 30 x 1block ...passed 00:27:32.883 Test: blockdev writev readv block ...passed 00:27:32.883 Test: blockdev writev readv size > 128k ...passed 00:27:32.883 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:32.883 Test: blockdev comparev and writev ...passed 00:27:32.883 Test: blockdev nvme passthru rw ...passed 00:27:32.883 Test: blockdev nvme passthru vendor specific ...passed 00:27:32.883 Test: blockdev nvme admin passthru ...passed 00:27:32.883 Test: blockdev copy ...passed 00:27:32.883 Suite: bdevio tests on: crypto_ram 00:27:32.883 Test: blockdev write read block ...passed 00:27:32.883 Test: blockdev write zeroes read block ...passed 00:27:32.883 Test: blockdev write zeroes read no split ...passed 00:27:33.142 Test: blockdev write zeroes read split ...passed 00:27:33.142 Test: blockdev write zeroes read split partial ...passed 00:27:33.142 Test: blockdev reset ...passed 00:27:33.142 Test: blockdev write read 8 blocks ...passed 00:27:33.142 Test: blockdev write read size > 128k ...passed 00:27:33.142 Test: blockdev write read invalid size ...passed 00:27:33.142 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:33.142 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:33.142 Test: blockdev write read max offset ...passed 00:27:33.142 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:33.142 Test: blockdev writev readv 8 blocks ...passed 00:27:33.142 Test: blockdev writev readv 30 x 1block ...passed 00:27:33.142 Test: blockdev writev readv block ...passed 00:27:33.142 Test: blockdev writev readv size > 128k ...passed 00:27:33.142 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:33.142 Test: blockdev comparev and writev ...passed 00:27:33.142 Test: blockdev nvme passthru rw ...passed 00:27:33.142 Test: blockdev nvme passthru vendor specific ...passed 00:27:33.142 Test: blockdev nvme admin passthru ...passed 00:27:33.142 Test: blockdev copy ...passed 00:27:33.142 00:27:33.142 Run Summary: Type Total Ran Passed Failed Inactive 00:27:33.142 suites 4 4 n/a 0 0 00:27:33.142 tests 92 92 92 0 0 00:27:33.142 asserts 520 520 520 0 n/a 00:27:33.142 00:27:33.142 Elapsed time = 0.513 seconds 00:27:33.142 0 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 135689 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 135689 ']' 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 135689 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 135689 00:27:33.142 13:50:20 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 135689' 00:27:33.142 killing process with pid 135689 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 135689 00:27:33.142 13:50:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 135689 00:27:33.401 13:50:21 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:27:33.401 00:27:33.401 real 0m3.445s 00:27:33.401 user 0m9.586s 00:27:33.401 sys 0m0.508s 00:27:33.401 13:50:21 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:33.401 13:50:21 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:33.401 ************************************ 00:27:33.401 END TEST bdev_bounds 00:27:33.401 ************************************ 00:27:33.660 13:50:21 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:33.660 13:50:21 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:27:33.660 13:50:21 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:33.660 13:50:21 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:33.660 13:50:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:33.660 ************************************ 00:27:33.660 START TEST bdev_nbd 00:27:33.660 ************************************ 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=136241 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 136241 /var/tmp/spdk-nbd.sock 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 136241 ']' 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:33.660 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:33.660 13:50:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:33.660 [2024-07-15 13:50:21.126904] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:27:33.660 [2024-07-15 13:50:21.126958] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:33.660 [2024-07-15 13:50:21.209539] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:33.919 [2024-07-15 13:50:21.299498] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:33.919 [2024-07-15 13:50:21.320422] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:33.919 [2024-07-15 13:50:21.328442] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:33.919 [2024-07-15 13:50:21.336460] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:33.919 [2024-07-15 13:50:21.431051] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:36.450 [2024-07-15 13:50:23.600726] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:36.450 [2024-07-15 13:50:23.600777] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:36.450 [2024-07-15 13:50:23.600787] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:36.450 [2024-07-15 13:50:23.608747] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:36.450 [2024-07-15 13:50:23.608761] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:36.450 [2024-07-15 13:50:23.608769] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:36.450 [2024-07-15 13:50:23.616765] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:36.450 [2024-07-15 13:50:23.616775] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:36.450 [2024-07-15 13:50:23.616782] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:36.450 [2024-07-15 13:50:23.624786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:36.450 [2024-07-15 13:50:23.624797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:36.450 [2024-07-15 13:50:23.624804] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:36.450 1+0 records in 00:27:36.450 1+0 records out 00:27:36.450 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337475 s, 12.1 MB/s 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:36.450 13:50:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:36.709 1+0 records in 00:27:36.709 1+0 records out 00:27:36.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235362 s, 17.4 MB/s 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:36.709 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:36.968 13:50:24 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:36.968 1+0 records in 00:27:36.968 1+0 records out 00:27:36.968 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245885 s, 16.7 MB/s 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:36.968 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:37.226 1+0 records in 00:27:37.226 1+0 records out 00:27:37.226 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329489 s, 12.4 MB/s 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:37.227 13:50:24 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd0", 00:27:37.227 "bdev_name": "crypto_ram" 00:27:37.227 }, 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd1", 00:27:37.227 "bdev_name": "crypto_ram1" 00:27:37.227 }, 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd2", 00:27:37.227 "bdev_name": "crypto_ram2" 00:27:37.227 }, 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd3", 00:27:37.227 "bdev_name": "crypto_ram3" 00:27:37.227 } 00:27:37.227 ]' 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd0", 00:27:37.227 "bdev_name": "crypto_ram" 00:27:37.227 }, 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd1", 00:27:37.227 "bdev_name": "crypto_ram1" 00:27:37.227 }, 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd2", 00:27:37.227 "bdev_name": "crypto_ram2" 00:27:37.227 }, 00:27:37.227 { 00:27:37.227 "nbd_device": "/dev/nbd3", 00:27:37.227 "bdev_name": "crypto_ram3" 00:27:37.227 } 00:27:37.227 ]' 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:37.227 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:37.485 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:37.485 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:37.485 13:50:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:37.485 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:37.744 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:38.002 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:38.260 13:50:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:27:38.518 /dev/nbd0 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:38.518 13:50:26 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:38.518 1+0 records in 00:27:38.518 1+0 records out 00:27:38.518 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278211 s, 14.7 MB/s 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:38.518 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:27:38.777 /dev/nbd1 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:38.777 1+0 records in 00:27:38.777 1+0 records out 00:27:38.777 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306397 s, 13.4 MB/s 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:38.777 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:27:39.035 /dev/nbd10 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:39.035 1+0 records in 00:27:39.035 1+0 records out 00:27:39.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000172946 s, 23.7 MB/s 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:39.035 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:27:39.294 /dev/nbd11 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:39.294 1+0 records in 00:27:39.294 1+0 records out 00:27:39.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000370464 s, 11.1 MB/s 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd0", 00:27:39.294 "bdev_name": "crypto_ram" 00:27:39.294 }, 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd1", 00:27:39.294 "bdev_name": "crypto_ram1" 00:27:39.294 }, 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd10", 00:27:39.294 "bdev_name": "crypto_ram2" 00:27:39.294 }, 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd11", 00:27:39.294 "bdev_name": "crypto_ram3" 00:27:39.294 } 00:27:39.294 ]' 00:27:39.294 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd0", 00:27:39.294 "bdev_name": "crypto_ram" 00:27:39.294 }, 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd1", 00:27:39.294 "bdev_name": "crypto_ram1" 00:27:39.294 }, 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd10", 00:27:39.294 "bdev_name": "crypto_ram2" 00:27:39.294 }, 00:27:39.294 { 00:27:39.294 "nbd_device": "/dev/nbd11", 00:27:39.294 "bdev_name": "crypto_ram3" 00:27:39.294 } 00:27:39.294 ]' 00:27:39.294 
13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:27:39.551 /dev/nbd1 00:27:39.551 /dev/nbd10 00:27:39.551 /dev/nbd11' 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:27:39.551 /dev/nbd1 00:27:39.551 /dev/nbd10 00:27:39.551 /dev/nbd11' 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:27:39.551 256+0 records in 00:27:39.551 256+0 records out 00:27:39.551 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109862 s, 95.4 MB/s 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:39.551 13:50:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:27:39.551 256+0 records in 00:27:39.551 256+0 records out 00:27:39.551 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0530108 s, 19.8 MB/s 00:27:39.551 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:39.551 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:27:39.551 256+0 records in 00:27:39.551 256+0 records out 00:27:39.551 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0441541 s, 23.7 MB/s 00:27:39.551 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:39.551 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:27:39.551 256+0 records in 00:27:39.551 256+0 records out 00:27:39.551 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.029081 s, 36.1 MB/s 00:27:39.551 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:39.551 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:27:39.551 256+0 records in 00:27:39.551 256+0 records out 00:27:39.551 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0369284 s, 28.4 MB/s 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:39.552 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:39.809 13:50:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:39.809 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:40.066 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:40.066 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:40.066 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:40.066 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:40.066 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:40.067 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:40.067 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:40.067 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:40.067 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:40.067 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:40.324 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:40.324 13:50:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:27:40.582 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:40.582 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:40.582 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:40.582 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:40.582 13:50:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:40.582 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:40.583 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:40.840 malloc_lvol_verify 00:27:40.840 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:41.097 b43d78c9-a775-4af9-8e57-2de0965a98ff 00:27:41.097 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:27:41.097 c541c0ac-10e3-402d-ac58-b5c3a09ceb99 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:27:41.354 /dev/nbd0 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:27:41.354 mke2fs 1.46.5 (30-Dec-2021) 00:27:41.354 Discarding device blocks: 0/4096 done 00:27:41.354 Creating filesystem with 4096 1k blocks and 1024 inodes 00:27:41.354 00:27:41.354 Allocating group tables: 0/1 done 00:27:41.354 Writing inode tables: 0/1 done 00:27:41.354 Creating journal (1024 blocks): done 00:27:41.354 Writing superblocks and filesystem accounting information: 0/1 done 00:27:41.354 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:41.354 13:50:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:41.612 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:41.612 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:41.612 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:41.612 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:41.612 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 136241 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 136241 ']' 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 136241 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 136241 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 136241' 00:27:41.613 killing process with pid 136241 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 136241 00:27:41.613 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 136241 
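For reference, the nbd data-verify and teardown sequence traced above reduces to roughly the following hand-written sketch. It is not the real nbd_common.sh source: the device list, temp file, cmp/dd arguments and RPC socket are taken from the trace, while the loop layout and the poll delay are assumptions.

    #!/usr/bin/env bash
    # Sketch of nbd_dd_data_verify + nbd_stop_disks as traced above (assumed
    # re-creation, not the real helper). Paths and socket match this runner.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    tmp=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)

    # verify: the 1 MiB random pattern written earlier with dd oflag=direct must
    # read back identically from every nbd device
    for dev in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp" "$dev"
    done
    rm "$tmp"

    # teardown: stop each disk over RPC, then poll /proc/partitions (up to 20
    # tries, mirroring waitfornbd_exit) until the kernel releases the device
    for dev in "${nbd_list[@]}"; do
        "$rpc" -s "$sock" nbd_stop_disk "$dev"
        name=$(basename "$dev")
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions || break
            sleep 0.1   # assumed delay; the trace only shows the retry bound
        done
    done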
00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:27:42.181 00:27:42.181 real 0m8.490s 00:27:42.181 user 0m10.706s 00:27:42.181 sys 0m3.239s 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:42.181 ************************************ 00:27:42.181 END TEST bdev_nbd 00:27:42.181 ************************************ 00:27:42.181 13:50:29 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:42.181 13:50:29 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:27:42.181 13:50:29 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:27:42.181 13:50:29 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:27:42.181 13:50:29 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:27:42.181 13:50:29 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:42.181 13:50:29 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:42.181 13:50:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:42.181 ************************************ 00:27:42.181 START TEST bdev_fio 00:27:42.181 ************************************ 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:42.181 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:42.181 ************************************ 00:27:42.181 START TEST bdev_fio_rw_verify 00:27:42.181 ************************************ 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:42.181 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:42.440 13:50:29 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:42.698 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:42.698 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:42.698 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:42.698 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:42.698 fio-3.35 00:27:42.698 Starting 4 threads 00:27:57.649 00:27:57.649 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=137939: Mon Jul 15 13:50:42 2024 00:27:57.649 read: IOPS=26.2k, BW=102MiB/s (107MB/s)(1023MiB/10001msec) 00:27:57.649 slat (usec): min=11, max=1342, avg=53.54, stdev=32.32 00:27:57.649 clat (usec): min=12, max=2577, avg=295.95, stdev=191.41 00:27:57.649 lat (usec): min=45, max=2839, avg=349.49, stdev=207.11 00:27:57.649 clat percentiles (usec): 00:27:57.649 | 50.000th=[ 243], 99.000th=[ 906], 99.900th=[ 1090], 99.990th=[ 1418], 00:27:57.649 | 99.999th=[ 2442] 00:27:57.649 write: IOPS=28.9k, BW=113MiB/s (118MB/s)(1099MiB/9730msec); 0 zone resets 00:27:57.649 slat (usec): min=16, max=457, avg=62.34, stdev=30.94 00:27:57.649 clat (usec): min=15, max=1538, avg=328.57, stdev=197.36 00:27:57.649 lat (usec): min=43, max=1736, avg=390.91, stdev=211.33 00:27:57.649 clat percentiles (usec): 00:27:57.649 | 50.000th=[ 285], 99.000th=[ 955], 99.900th=[ 1106], 99.990th=[ 1205], 00:27:57.649 | 99.999th=[ 1483] 00:27:57.649 bw ( KiB/s): min=95488, max=161887, per=97.21%, avg=112412.58, stdev=4000.08, samples=76 00:27:57.649 iops : min=23872, max=40471, avg=28103.11, stdev=999.99, samples=76 00:27:57.649 lat (usec) : 20=0.01%, 50=0.09%, 100=7.53%, 250=39.34%, 500=37.09% 00:27:57.649 lat (usec) : 750=11.99%, 1000=3.49% 00:27:57.649 lat (msec) : 2=0.47%, 4=0.01% 00:27:57.649 cpu : usr=99.66%, sys=0.01%, ctx=71, majf=0, minf=286 00:27:57.649 IO depths : 1=3.9%, 2=27.5%, 4=54.9%, 8=13.7%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:57.649 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:57.649 complete : 0=0.0%, 4=87.9%, 8=12.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:57.649 issued rwts: total=261931,281289,0,0 short=0,0,0,0 dropped=0,0,0,0 00:27:57.649 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:57.649 00:27:57.649 Run status group 0 (all jobs): 00:27:57.649 READ: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=1023MiB (1073MB), run=10001-10001msec 00:27:57.649 WRITE: bw=113MiB/s (118MB/s), 113MiB/s-113MiB/s (118MB/s-118MB/s), io=1099MiB (1152MB), run=9730-9730msec 00:27:57.649 00:27:57.649 real 0m13.383s 00:27:57.649 user 0m45.194s 00:27:57.649 sys 0m0.479s 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:27:57.649 ************************************ 00:27:57.649 END TEST bdev_fio_rw_verify 00:27:57.649 ************************************ 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "68927ac2-f1f4-5d2a-9617-aa1f31af3f7b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "68927ac2-f1f4-5d2a-9617-aa1f31af3f7b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "712767db-6b07-5c00-8a57-f218d83f7433"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "712767db-6b07-5c00-8a57-f218d83f7433",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5d57016b-87d2-5b1a-a099-4e6fd8be10b0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5d57016b-87d2-5b1a-a099-4e6fd8be10b0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b8acd58a-15e3-5ddc-bca7-75228a151249"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b8acd58a-15e3-5ddc-bca7-75228a151249",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:27:57.649 crypto_ram1 00:27:57.649 crypto_ram2 00:27:57.649 crypto_ram3 ]] 00:27:57.649 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "68927ac2-f1f4-5d2a-9617-aa1f31af3f7b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "68927ac2-f1f4-5d2a-9617-aa1f31af3f7b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "712767db-6b07-5c00-8a57-f218d83f7433"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "712767db-6b07-5c00-8a57-f218d83f7433",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5d57016b-87d2-5b1a-a099-4e6fd8be10b0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5d57016b-87d2-5b1a-a099-4e6fd8be10b0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b8acd58a-15e3-5ddc-bca7-75228a151249"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b8acd58a-15e3-5ddc-bca7-75228a151249",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:57.650 
13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:57.650 ************************************ 00:27:57.650 START TEST bdev_fio_trim 00:27:57.650 ************************************ 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:57.650 13:50:43 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.650 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:57.650 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:57.650 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:57.650 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:57.650 fio-3.35 00:27:57.650 Starting 4 threads 00:28:09.908 00:28:09.908 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=139801: Mon Jul 15 13:50:56 2024 00:28:09.908 write: IOPS=42.6k, BW=167MiB/s (175MB/s)(1666MiB/10001msec); 0 zone resets 00:28:09.908 slat (usec): min=13, max=1227, avg=51.93, stdev=24.49 00:28:09.908 clat (usec): min=15, max=1444, avg=201.94, stdev=102.11 00:28:09.908 lat (usec): min=51, max=1659, avg=253.87, stdev=114.27 00:28:09.908 clat percentiles (usec): 00:28:09.908 | 50.000th=[ 186], 99.000th=[ 490], 99.900th=[ 611], 99.990th=[ 725], 00:28:09.908 | 99.999th=[ 1254] 00:28:09.908 bw ( KiB/s): min=156000, max=263769, per=100.00%, avg=171235.00, stdev=8378.97, samples=76 00:28:09.908 iops : min=39000, max=65942, avg=42808.74, stdev=2094.73, samples=76 00:28:09.908 trim: IOPS=42.6k, BW=167MiB/s (175MB/s)(1666MiB/10001msec); 0 zone resets 00:28:09.908 slat (usec): min=4, max=340, avg=15.83, stdev= 6.94 00:28:09.908 clat (usec): min=50, max=1660, avg=254.00, stdev=114.28 00:28:09.908 lat (usec): min=56, max=1706, avg=269.83, stdev=116.27 00:28:09.908 clat percentiles (usec): 00:28:09.908 | 50.000th=[ 239], 99.000th=[ 578], 99.900th=[ 701], 99.990th=[ 898], 00:28:09.908 | 99.999th=[ 1500] 00:28:09.908 bw ( KiB/s): min=156000, max=263769, per=100.00%, avg=171235.00, stdev=8378.97, samples=76 00:28:09.908 iops : min=39000, max=65942, avg=42808.74, stdev=2094.73, samples=76 00:28:09.908 lat (usec) : 20=0.01%, 50=1.00%, 100=9.37%, 250=51.26%, 500=36.49% 00:28:09.908 lat (usec) : 750=1.86%, 
1000=0.01% 00:28:09.908 lat (msec) : 2=0.01% 00:28:09.908 cpu : usr=99.66%, sys=0.00%, ctx=64, majf=0, minf=117 00:28:09.908 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:09.908 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:09.908 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:09.908 issued rwts: total=0,426519,426520,0 short=0,0,0,0 dropped=0,0,0,0 00:28:09.908 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:09.908 00:28:09.908 Run status group 0 (all jobs): 00:28:09.908 WRITE: bw=167MiB/s (175MB/s), 167MiB/s-167MiB/s (175MB/s-175MB/s), io=1666MiB (1747MB), run=10001-10001msec 00:28:09.908 TRIM: bw=167MiB/s (175MB/s), 167MiB/s-167MiB/s (175MB/s-175MB/s), io=1666MiB (1747MB), run=10001-10001msec 00:28:09.908 00:28:09.908 real 0m13.409s 00:28:09.908 user 0m45.456s 00:28:09.908 sys 0m0.476s 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:28:09.908 ************************************ 00:28:09.908 END TEST bdev_fio_trim 00:28:09.908 ************************************ 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:28:09.908 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:28:09.908 00:28:09.908 real 0m27.158s 00:28:09.908 user 1m30.837s 00:28:09.908 sys 0m1.157s 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:09.908 ************************************ 00:28:09.908 END TEST bdev_fio 00:28:09.908 ************************************ 00:28:09.908 13:50:56 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:28:09.908 13:50:56 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:09.908 13:50:56 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:09.908 13:50:56 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:28:09.908 13:50:56 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:09.908 13:50:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:09.908 ************************************ 00:28:09.908 START TEST bdev_verify 00:28:09.908 ************************************ 00:28:09.908 13:50:56 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:09.908 [2024-07-15 13:50:56.954602] Starting SPDK v24.09-pre git sha1 
9cede6267 / DPDK 24.03.0 initialization... 00:28:09.908 [2024-07-15 13:50:56.954651] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid141220 ] 00:28:09.908 [2024-07-15 13:50:57.040107] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:09.908 [2024-07-15 13:50:57.128608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:09.908 [2024-07-15 13:50:57.128610] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:09.908 [2024-07-15 13:50:57.149636] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:09.908 [2024-07-15 13:50:57.157663] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:09.908 [2024-07-15 13:50:57.165683] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:09.908 [2024-07-15 13:50:57.268381] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:12.440 [2024-07-15 13:50:59.457685] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:12.441 [2024-07-15 13:50:59.457759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:12.441 [2024-07-15 13:50:59.457770] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:12.441 [2024-07-15 13:50:59.465702] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:12.441 [2024-07-15 13:50:59.465716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:12.441 [2024-07-15 13:50:59.465724] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:12.441 [2024-07-15 13:50:59.473722] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:12.441 [2024-07-15 13:50:59.473734] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:12.441 [2024-07-15 13:50:59.473741] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:12.441 [2024-07-15 13:50:59.481755] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:12.441 [2024-07-15 13:50:59.481769] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:12.441 [2024-07-15 13:50:59.481777] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:12.441 Running I/O for 5 seconds... 
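The 5-second verify pass whose results follow was launched by the bdevperf invocation shown in the trace; stripped of the run_test wrapper it is simply the command below (flags verbatim from the log: -q queue depth, -o I/O size in bytes, -w workload, -t run time in seconds, -m core mask; -C is carried over unchanged from the trace).

    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3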
00:28:17.706 00:28:17.706 Latency(us) 00:28:17.706 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:17.706 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x0 length 0x1000 00:28:17.706 crypto_ram : 5.05 701.38 2.74 0.00 0.00 181907.87 2379.24 132211.76 00:28:17.706 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x1000 length 0x1000 00:28:17.706 crypto_ram : 5.05 709.54 2.77 0.00 0.00 180164.00 3647.22 131299.95 00:28:17.706 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x0 length 0x1000 00:28:17.706 crypto_ram1 : 5.05 702.83 2.75 0.00 0.00 181233.15 2507.46 121270.09 00:28:17.706 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x1000 length 0x1000 00:28:17.706 crypto_ram1 : 5.05 709.42 2.77 0.00 0.00 179788.57 3989.15 120358.29 00:28:17.706 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x0 length 0x1000 00:28:17.706 crypto_ram2 : 5.04 5499.58 21.48 0.00 0.00 23112.12 3262.55 20173.69 00:28:17.706 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x1000 length 0x1000 00:28:17.706 crypto_ram2 : 5.04 5559.74 21.72 0.00 0.00 22878.22 4245.59 19945.74 00:28:17.706 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x0 length 0x1000 00:28:17.706 crypto_ram3 : 5.04 5507.41 21.51 0.00 0.00 23050.95 2208.28 20173.69 00:28:17.706 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:17.706 Verification LBA range: start 0x1000 length 0x1000 00:28:17.706 crypto_ram3 : 5.04 5558.52 21.71 0.00 0.00 22835.32 3903.67 20173.69 00:28:17.706 =================================================================================================================== 00:28:17.706 Total : 24948.42 97.45 0.00 0.00 40858.72 2208.28 132211.76 00:28:17.706 00:28:17.706 real 0m8.088s 00:28:17.706 user 0m15.420s 00:28:17.706 sys 0m0.348s 00:28:17.706 13:51:04 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:17.706 13:51:04 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:28:17.706 ************************************ 00:28:17.706 END TEST bdev_verify 00:28:17.706 ************************************ 00:28:17.706 13:51:05 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:28:17.707 13:51:05 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:17.707 13:51:05 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:28:17.707 13:51:05 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:17.707 13:51:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:17.707 ************************************ 00:28:17.707 START TEST bdev_verify_big_io 00:28:17.707 ************************************ 00:28:17.707 13:51:05 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:17.707 [2024-07-15 13:51:05.115682] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:17.707 [2024-07-15 13:51:05.115722] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid142682 ] 00:28:17.707 [2024-07-15 13:51:05.200264] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:17.707 [2024-07-15 13:51:05.284744] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:17.707 [2024-07-15 13:51:05.284746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:17.707 [2024-07-15 13:51:05.305784] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:17.707 [2024-07-15 13:51:05.313807] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:17.707 [2024-07-15 13:51:05.321828] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:17.965 [2024-07-15 13:51:05.427364] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:20.495 [2024-07-15 13:51:07.608297] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:20.495 [2024-07-15 13:51:07.608370] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:20.495 [2024-07-15 13:51:07.608380] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.495 [2024-07-15 13:51:07.616314] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:20.495 [2024-07-15 13:51:07.616328] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:20.495 [2024-07-15 13:51:07.616336] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.495 [2024-07-15 13:51:07.624334] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:20.495 [2024-07-15 13:51:07.624346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:20.495 [2024-07-15 13:51:07.624354] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.495 [2024-07-15 13:51:07.632357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:20.495 [2024-07-15 13:51:07.632369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:20.495 [2024-07-15 13:51:07.632377] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.495 Running I/O for 5 seconds... 00:28:20.756 [2024-07-15 13:51:08.242503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:20.756 [2024-07-15 13:51:08.242803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[2024-07-15 13:51:08.242859 through 13:51:08.459628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! -- the same error line repeats continuously over this interval while the verify workload is running
00:28:21.024 [2024-07-15 13:51:08.460805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.461592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.462517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.463339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.463527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.463538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.465382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.465689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.466583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.467571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.468797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.469365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.470188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.471180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.471365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.471376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.473283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.474323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.475259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.476265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.476893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.477868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.478929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.479923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.024 [2024-07-15 13:51:08.480114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.480125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.482543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.483374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.484368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.485351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.486488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.487314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.488302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.489288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.489572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.489583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.492307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.493283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.494289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.495191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.496284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.497286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.498292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.499002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.499319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.499331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.502001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.024 [2024-07-15 13:51:08.503090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.025 [2024-07-15 13:51:08.504128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.505110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.506189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.507204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.508218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.508977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.509281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.509292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.511762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.512769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.513776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.514173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.515290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.516340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.517386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.517660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.517987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.518002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.520525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.521556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.522383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.523340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.524569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.525563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.025 [2024-07-15 13:51:08.526077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.526345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.526664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.526676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.529336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.530329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.531064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.531882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.533066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.533813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.534077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.534336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.534663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.534675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.536913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.537346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.538228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.539240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.540532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.540800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.541062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.541321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.541650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.541662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.025 [2024-07-15 13:51:08.543494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.544481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.545365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.546346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.547028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.547299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.547561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.547823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.548155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.548167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.549609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.550438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.551435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.552429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.552880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.553141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.553400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.553661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.553850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.553861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.555834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.556781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.557318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.557580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.025 [2024-07-15 13:51:08.558167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.558428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.558690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.558956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.559307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.559319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.561276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.561542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.561801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.562064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.562603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.562873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.563136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.025 [2024-07-15 13:51:08.563395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.563718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.563730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.565770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.566041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.566309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.566337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.566943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.567208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.567465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.567728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.026 [2024-07-15 13:51:08.568005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.568017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.570007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.570289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.570557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.570815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.570848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.571154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.571422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.571681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.571947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.572212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.572545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.572557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.574860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.575151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.026 [2024-07-15 13:51:08.575162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.576812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.576843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.576872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.576900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.577769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.579922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.580249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.580261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.581879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.026 [2024-07-15 13:51:08.581919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.581947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.581987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.582752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.584914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.585168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.585180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.586819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.586851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.586881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.026 [2024-07-15 13:51:08.586909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.587211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.587258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.587300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.026 [2024-07-15 13:51:08.587351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.587390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.587662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.587676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.589893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.590169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.590180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.027 [2024-07-15 13:51:08.592710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.592796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.593098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.593110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.594865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.594897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.594934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.594962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.595778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.027 [2024-07-15 13:51:08.597891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.597919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.598261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.598273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.027 [2024-07-15 13:51:08.600913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.600924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.602534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.602565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.602596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.602625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.602964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.603008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.603040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.603069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.603098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.028 [2024-07-15 13:51:08.603439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.603451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.605684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.606013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.606026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.607717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.607749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.607776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.607803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.608124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.608161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.028 [2024-07-15 13:51:08.608191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.029 [2024-07-15 13:51:08.608219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.029 [2024-07-15 13:51:08.608248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.029 [2024-07-15 13:51:08.608481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.029 [2024-07-15 13:51:08.608493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.029 [2024-07-15 13:51:08.610226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:21.029 [... the same "*ERROR*: Failed to get src_mbufs!" line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats several hundred times between 13:51:08.610 and 13:51:08.853 ...]
00:28:21.295 [2024-07-15 13:51:08.853660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:21.295 [2024-07-15 13:51:08.853933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.853946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.856081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.856352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.856614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.856874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.857195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.857466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.857732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.857992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.858278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.858564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.858577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.860448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.860712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.860972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.861235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.861482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.861756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.862022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.862281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.862544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.862848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.862860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.295 [2024-07-15 13:51:08.864794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.865068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.865103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.865370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.865706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.865974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.866235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.866497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.866766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.867051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.867064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.869056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.869337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.869597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.869631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.869907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.870181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.870443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.870707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.870971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.871318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.871331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.873008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.873041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.295 [2024-07-15 13:51:08.873073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.873107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.295 [2024-07-15 13:51:08.873462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.873500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.873530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.873559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.873588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.873878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.873890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.875615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.875662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.875691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.875719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.876503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.296 [2024-07-15 13:51:08.878515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.878656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.879008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.879022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.880695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.880727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.880755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.880783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.881534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.296 [2024-07-15 13:51:08.883686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.883743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.884037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.884049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.885832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.885863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.885890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.885918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.886684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.888881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.296 [2024-07-15 13:51:08.888911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.889207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.889218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.890982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.891791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.893756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.893798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.893846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.893886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.894231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.894289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.894347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.894380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.894409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.894717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.296 [2024-07-15 13:51:08.894729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.896971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.897005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.897352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.897364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.899076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.899126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.899168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.899207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.899523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.296 [2024-07-15 13:51:08.899560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.899588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.899616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.899644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.899990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.900005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.901625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.297 [2024-07-15 13:51:08.901658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.901689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.901731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.902536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.904865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.905186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.905199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.906854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.906886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.297 [2024-07-15 13:51:08.906914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.297 [2024-07-15 13:51:08.906942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.907754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.558 [2024-07-15 13:51:08.909624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.909952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.909965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.911675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.911722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.911763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.911793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.912138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.559 [2024-07-15 13:51:08.912176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.912207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.912237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.912266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.912533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.912544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.913948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.913982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.914684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.915747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.915778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.915805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.915839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.916085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.916122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.916151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.559 [2024-07-15 13:51:08.916178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.916205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.916544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.916556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.918655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.919832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.919871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.919902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.919930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.920111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.920154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.920184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.920216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.920245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.559 [2024-07-15 13:51:08.920549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.920561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.922852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.924737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.559 [2024-07-15 13:51:08.926851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.926885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.926912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.926939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.927516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.928655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.928701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.928732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.928760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.928944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.928989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.929022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.929050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.559 [2024-07-15 13:51:08.929100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.560 [2024-07-15 13:51:08.929287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.560 [2024-07-15 13:51:08.929298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.560 [2024-07-15 13:51:08.931055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.560 [2024-07-15 13:51:08.931103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.560 [2024-07-15 13:51:08.931132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:21.560 ... (same *ERROR* message repeated continuously from 13:51:08.931 to 13:51:09.177) ...
00:28:21.826 [2024-07-15 13:51:09.177204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:21.826 [2024-07-15 13:51:09.179089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.179983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.181659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.181705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.181737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.181767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.182560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.826 [2024-07-15 13:51:09.184268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.184988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.186509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.826 [2024-07-15 13:51:09.186542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.186571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.186601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.186864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.186911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.186941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.186970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.187004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.187337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.187350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.827 [2024-07-15 13:51:09.189629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.189763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.190032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.190043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.191702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.827 [2024-07-15 13:51:09.193587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.193990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.195811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.827 [2024-07-15 13:51:09.197635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.197982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.198001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.199805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.200954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.200985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.201796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.827 [2024-07-15 13:51:09.201807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.203128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.203159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.203186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.203213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.203394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.827 [2024-07-15 13:51:09.203436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.203465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.203492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.203520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.203833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.203843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.204910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.204942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.204970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.205795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.828 [2024-07-15 13:51:09.207367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.207945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.209862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.828 [2024-07-15 13:51:09.211508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.211818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.212012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.212023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.213903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.828 [2024-07-15 13:51:09.215899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.215989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.216207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.216219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.217793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.218081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.218093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.828 [2024-07-15 13:51:09.220055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.829 [2024-07-15 13:51:09.220480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.220703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.221829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.221860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.221887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.221914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.222428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.829 [2024-07-15 13:51:09.224855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.224869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.226629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.228934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.829 [2024-07-15 13:51:09.230145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.230175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.231567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.233327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.233360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.233388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.234587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.236708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.237693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.829 [2024-07-15 13:51:09.237961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.238224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.238523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.238797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.239193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.240013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.241032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.241229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.241240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.243397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.244152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.244417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.829 [2024-07-15 13:51:09.244679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.244986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.245290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.246203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.247044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.248051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.248240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.248252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.250432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.250710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.250979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:21.830 [2024-07-15 13:51:09.251253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:21.830 [2024-07-15 13:51:09.251592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated for every allocation attempt between 13:51:09.251 and 13:51:09.481 ...] 
00:28:22.096 [2024-07-15 13:51:09.481266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.096 [2024-07-15 13:51:09.481294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.481917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.485915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.486233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.486245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.096 [2024-07-15 13:51:09.489637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.489957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.492888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.493072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.493083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.096 [2024-07-15 13:51:09.496594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.496930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.499924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.500168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.500179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.096 [2024-07-15 13:51:09.503510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.503712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.506776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.506812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.506842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.096 [2024-07-15 13:51:09.506870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.507635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.510921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.511110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.097 [2024-07-15 13:51:09.511122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.513904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.513945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.513973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.514693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.517915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.519933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.097 [2024-07-15 13:51:09.519967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.520235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.520272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.520306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.520491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.540637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.540706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.541702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.549377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.550361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.550404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.551374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.551416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.551909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.552252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.552264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.554881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.555928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.556893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.557659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.558854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.559850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.560559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.560837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.097 [2024-07-15 13:51:09.561162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.561175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.563705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.564744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.565298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.566120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.567313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.568207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.568469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.568729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.569033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.569045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.571271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.571710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.572521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.573518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.574785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.575070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.575329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.575587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.575901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.575913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.577510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.578516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.097 [2024-07-15 13:51:09.579589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.580604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.581064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.581333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.581596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.581866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.582116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.582128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.584269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.585157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.097 [2024-07-15 13:51:09.586136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.587147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.587685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.587945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.588205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.588781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.589016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.589027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.590982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.591975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.592962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.593333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.593932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.594196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.098 [2024-07-15 13:51:09.594611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.595436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.595617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.595628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.597727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.598723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.599305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.599575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.600168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.600429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.601459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.602543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.602727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.602738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.604853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.605724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.605985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.606248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.606826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.607776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.608616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.609633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.609820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.609831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.098 [2024-07-15 13:51:09.612110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.612379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.612640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.612898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.614014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.614828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.615818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.616808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.617088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.617100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.618501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.618770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.619033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.619293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.620329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.621313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.622300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.622935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.623130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.623141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.624537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.624802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.625062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.625696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.098 [2024-07-15 13:51:09.626906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.627891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.628728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.629605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.629830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.629841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.631488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.631783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.632239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.633066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.634351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.635349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.636063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.636881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.637066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.637077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.638900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.639234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.640092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.641107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.098 [2024-07-15 13:51:09.642361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.642924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.643741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.644731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.099 [2024-07-15 13:51:09.644913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.644924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.646739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.647776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.648706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.649644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.651084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.652152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.653232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.653732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.653915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.653926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.655979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.656808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.657794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.658774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.659646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.660474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.661465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.662448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.662733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.662745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.665736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.666816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.099 [2024-07-15 13:51:09.667802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.668724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.669961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.671054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.672057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.672941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.673220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.673232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.675188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.675453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.675712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.675981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.676564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.676830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.677104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.677362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.677727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.677739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.679680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.679942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.680209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.680474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.681050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.099 [2024-07-15 13:51:09.681318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.099 [2024-07-15 13:51:09.681587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats several hundred times between 13:51:09.681587 and 13:51:09.885345, console timestamps 00:28:22.099 through 00:28:22.365)
00:28:22.365 [2024-07-15 13:51:09.885345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:22.365 [2024-07-15 13:51:09.886351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.886542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.886553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.888662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.889019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.889295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.889556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.889898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.890428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.891243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.892255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.365 [2024-07-15 13:51:09.893279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.893500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.893511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.895264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.895533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.895792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.896055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.896380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.897338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.898346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.899416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.899745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.899946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.366 [2024-07-15 13:51:09.899957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.901628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.901922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.902770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.903618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.903804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.904812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.905427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.906586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.907641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.907835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.907846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.909561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.910056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.910877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.911932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.912130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.913072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.913641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.914488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.915531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.915721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.915734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.917625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.366 [2024-07-15 13:51:09.917893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.918166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.918430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.918734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.919011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.919276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.919548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.919818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.920191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.920204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.922102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.922365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.922627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.922886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.923152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.923427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.923691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.923950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.924248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.924593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.924605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.926651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.926919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.927192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.366 [2024-07-15 13:51:09.927459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.927797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.928089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.928357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.928630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.928904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.929219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.929230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.931247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.931527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.931793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.932072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.932408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.932685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.932957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.933249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.933510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.933862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.933874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.935789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.936075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.936345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.936618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.936928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.366 [2024-07-15 13:51:09.937219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.937480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.937740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.938009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.938249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.938261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.940367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.940659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.940940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.941210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.941561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.366 [2024-07-15 13:51:09.941844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.942123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.942401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.942661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.942947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.942957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.944972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.945268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.945525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.945786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.946026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.946300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.946561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.367 [2024-07-15 13:51:09.946820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.947082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.947343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.947354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.949330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.949599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.949864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.950128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.950425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.950691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.950956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.951226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.951500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.951833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.951844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.953760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.954029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.954291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.954553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.954796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.955076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.955338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.955597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.955855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.367 [2024-07-15 13:51:09.956203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.956216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.958137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.958403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.958668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.958932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.959299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.959570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.959828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.960097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.960365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.960687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.960699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.962763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.963035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.963296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.963555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.963889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.964162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.964426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.964685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.964948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.965253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.965264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.367 [2024-07-15 13:51:09.967161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.967434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.967695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.967969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.968268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.968554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.968815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.969078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.969340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.969525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.969536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.971355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.971620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.971903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.971938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.972195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.972475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.972743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.973009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.973278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.973566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.973577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.975563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.975601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.367 [2024-07-15 13:51:09.975874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.975922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.367 [2024-07-15 13:51:09.976202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.977081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.977113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.977379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.977410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.977639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.977651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.980051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.981044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.628 [2024-07-15 13:51:09.981079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.981508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.981714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.982659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.983692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.983730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.984738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.985058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.985070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.987581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.988593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.988627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.989636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.629 [2024-07-15 13:51:09.989910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.990990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.991035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.992129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.993148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.993335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.993346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.995566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.995832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.995864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.996769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.997011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.997057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.998067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.999077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.999110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.999457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:09.999468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.000525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.000799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.000831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.001103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.001403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.001677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.629 [2024-07-15 13:51:10.002458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.002492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.003328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.003521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.003532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.004675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.005691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.005725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.006852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.007150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.008111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.008143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.008410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.008440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.008680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.008691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.010131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.010162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.010191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.010219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.010409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.011441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.011476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.011507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.629 [2024-07-15 13:51:10.011536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.011891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.011902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.012971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.013853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.015922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.016152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.629 [2024-07-15 13:51:10.016166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.017965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.629 [2024-07-15 13:51:10.019646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.019679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.019709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.019738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.019921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.019962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.019991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.020030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.020060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.020251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.020263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.630 [2024-07-15 13:51:10.021448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.630 [2024-07-15 13:51:10.021499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:22.630 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeat continuously from 13:51:10.021 through 13:51:10.233; remaining duplicate entries omitted ...]
00:28:22.635 [2024-07-15 13:51:10.233696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:22.635 [2024-07-15 13:51:10.233707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.235570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.235844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.236112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.236371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.236668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.236936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.237204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.237467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.238270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.238527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.238539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.240720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.240998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.241265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.241532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.241804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.242092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.243065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.243342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.243607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.243795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.635 [2024-07-15 13:51:10.243807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.245824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.897 [2024-07-15 13:51:10.246104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.246377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.246647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.246963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.247262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.248360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.248627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.249015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.249204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.249216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.251693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.251968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.252258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.252649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.252838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.253127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.253488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.254369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.254637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.254941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.254952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.256936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.257218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.257492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.897 [2024-07-15 13:51:10.258144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.258354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.258637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.259224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.259877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.260148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.260412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.260427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.262949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.263627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.264197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.897 [2024-07-15 13:51:10.264466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.264672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.265323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.265588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.266215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.266813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.267049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.267060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.268898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.269181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.269454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.269489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.269678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.898 [2024-07-15 13:51:10.270081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.270347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.271398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.271677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.272009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.272022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.274245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.274301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.274571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.274629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.274923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.275212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.275248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.276117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.276152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.276492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.276506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.278855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.279880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.279917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.280938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.281134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.281914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.282775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.898 [2024-07-15 13:51:10.282811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.283826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.284020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.284032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.286403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.287446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.287486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.288514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.288708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.289894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.289929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.290701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.291551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.291742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.291753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.293162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.294019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.294060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.294331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.294662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.294707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.295816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.296926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.296978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.898 [2024-07-15 13:51:10.297170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.297182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.300510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.301286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.301321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.301675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.302026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.302852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.303284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.303317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.303585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.303774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.303785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.898 [2024-07-15 13:51:10.304919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.305881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.305916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.306936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.307128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.308314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.308350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.309057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.309091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.309350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.309361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.899 [2024-07-15 13:51:10.311097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.311148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.311177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.311209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.311390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.312414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.312449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.312477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.312515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.312765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.312777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.313826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.313857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.313886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.313917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.314594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.899 [2024-07-15 13:51:10.316430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.316991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.317009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.318794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.899 [2024-07-15 13:51:10.320839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.320967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.321149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.321161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.322895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.899 [2024-07-15 13:51:10.325511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.325760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.326913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.326946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.326973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.327004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.327185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.327227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.327255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.899 [2024-07-15 13:51:10.327282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.327309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.327622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.327633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.331804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.900 [2024-07-15 13:51:10.331831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.332019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.332031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.333800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.336670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.336707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.336737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.336766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.336947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.336989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.337030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.337058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.337097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.337280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.900 [2024-07-15 13:51:10.337291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.338850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.339030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.339041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.341886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.900 [2024-07-15 13:51:10.343049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.343634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.346962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.348079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.348115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:22.900 [2024-07-15 13:51:10.348141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:22.900 [2024-07-15 13:51:10.348168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:22.900 [2024-07-15 13:51:10.348168 - 13:51:10.650020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (identical error emitted continuously over this interval)
00:28:23.169 [2024-07-15 13:51:10.650020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:23.169 [2024-07-15 13:51:10.650876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.650910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.651928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.652122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.652169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.652744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.653808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.653838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.654029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.654040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.657335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.658043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.658074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.658333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.658539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.659364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.660359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.660392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.661377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.661634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.661645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.664856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.665126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.665159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.169 [2024-07-15 13:51:10.665784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.666000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.666276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.666307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.667091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.667123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.667340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.667351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.670542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.670576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.670619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.670650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.670834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.671331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.671363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.671391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.671419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.671707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.671718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.169 [2024-07-15 13:51:10.673738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.673834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.674018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.674030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.676933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.676967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.677786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.680718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.680752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.680780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.680808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.681075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.681119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.681160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.169 [2024-07-15 13:51:10.681191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.681219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.681405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.681416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.684681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.169 [2024-07-15 13:51:10.685040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.685052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.687893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.170 [2024-07-15 13:51:10.688078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.688089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.690629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.690671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.690700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.690728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.691424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.693968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.694579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.170 [2024-07-15 13:51:10.697579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.697973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.698007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.698191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.698202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.701842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.703642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.703677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.170 [2024-07-15 13:51:10.703704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.703732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.703931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.703979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.704013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.704042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.704070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.704259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.704270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.706933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.706967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.706998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.707731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.170 [2024-07-15 13:51:10.711507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.711833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.714584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.714619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.714658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.714687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.715030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.715069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.715100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.715130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.170 [2024-07-15 13:51:10.715160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.715411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.715422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.171 [2024-07-15 13:51:10.718664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.718946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.722891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.725926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.171 [2024-07-15 13:51:10.725962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.726152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.726164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.729795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.733919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.171 [2024-07-15 13:51:10.733930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.736992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.737006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.739832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.739867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.739896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.739924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.740455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.743412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.171 [2024-07-15 13:51:10.743451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.743482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.171 [2024-07-15 13:51:10.743509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.743738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.743781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.743812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.743843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.743874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.744242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.744255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.746887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.746932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.746960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.746989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.747555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.172 [2024-07-15 13:51:10.750437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.750889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.751084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.751097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.753969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.754003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.754033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.754319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.754333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.756455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.756498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.756534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.757548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.172 [2024-07-15 13:51:10.757742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.172 [2024-07-15 13:51:10.757787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:23.441 [2024-07-15 13:51:11.009297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:23.441 [2024-07-15 13:51:11.009325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.009832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.011627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.011658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.011686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.011715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.011926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.011970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.012003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.012031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.012061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.012265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.012276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.441 [2024-07-15 13:51:11.013758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.013844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.014033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.014045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.015722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.015754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.015784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.015815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.016460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.017944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.441 [2024-07-15 13:51:11.017978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.018013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.018201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.018212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.019792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.019823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.019851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.019880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.441 [2024-07-15 13:51:11.020228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.020266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.020296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.020328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.020357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.020545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.020556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.021700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.021730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.021761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.021789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.022004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.022049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.022077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.022105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.022133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.442 [2024-07-15 13:51:11.022318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.022329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.023935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.023982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.024748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.025886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.025916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.025962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.025993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.026519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.442 [2024-07-15 13:51:11.027962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.028882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.029944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.029982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.030605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.031964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.031999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.442 [2024-07-15 13:51:11.032030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.032772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.033834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.033864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.033891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.033919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.034557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.035835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.035867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.035895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.035924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.442 [2024-07-15 13:51:11.036245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.036281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.036314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.036343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.036373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.036727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.036739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.037884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.037914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.037945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.037972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.038213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.442 [2024-07-15 13:51:11.038258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.038287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.038322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.038351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.038541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.038553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.039700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.039730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.039757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.039788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.040128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.040168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.443 [2024-07-15 13:51:11.040198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.040227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.040256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.040609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.040621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.041855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.041885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.041914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.041951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.042483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.043642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.043689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.043738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.043766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.044105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.044142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.044173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.044202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.443 [2024-07-15 13:51:11.044232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.044498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.044510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.045908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.045942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.045973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.046984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.047617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.048805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.049080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.049112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.049374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.049707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.443 [2024-07-15 13:51:11.049749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.050658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.050691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.051597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.051787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.706 [2024-07-15 13:51:11.051798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.052972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.053011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.054946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.056496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.056527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.057529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.057567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.057863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.057910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.058896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.058928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.058955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.059144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.059155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.060776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.706 [2024-07-15 13:51:11.060808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.061081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.061113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.061292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.062296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.062335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.062366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.063375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.063565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.063576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.065586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.065637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.706 [2024-07-15 13:51:11.065910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.065941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.066933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.068423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.068457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.069277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.707 [2024-07-15 13:51:11.069308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.069489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.069534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.070523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.070554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.071052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.071394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.071409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.073880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.074917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.076024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.076642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.076853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.076899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.077890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.078871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.079556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.079854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.079865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.082281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.083286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.084269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.084537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.084718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.707 [2024-07-15 13:51:11.085651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.086675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.087693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.087961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.088323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.088335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.090208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.090470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.090730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.090990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.091232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.091508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.091769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.092031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.092296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.092642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.092654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.094802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.095070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.095339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.095610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.095964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.096238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.707 [2024-07-15 13:51:11.096501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.707 [2024-07-15 13:51:11.096762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.712 [2024-07-15 13:51:11.305099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeat continuously between these two timestamps; duplicate lines omitted ...]
00:28:23.713 [2024-07-15 13:51:11.305145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.305175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.305204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.305233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.305423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.305434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.307787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.308930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.308965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.308992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.713 [2024-07-15 13:51:11.309338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.309566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.310993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.311888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.713 [2024-07-15 13:51:11.313693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.313704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.315906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.316131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.316144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.317905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.713 [2024-07-15 13:51:11.319505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.319537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.319566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.319596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.319938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.319976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.320015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.320044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.320073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.320257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.713 [2024-07-15 13:51:11.320269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.321412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.321443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.321475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.322823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.324520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.324945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.977 [2024-07-15 13:51:11.324977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.325797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.325985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.326037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.327021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.327053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.327699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.327885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.327896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.329794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.330057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.330088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.330374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.330385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.331413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.331444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.332248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.977 [2024-07-15 13:51:11.332280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.978 [2024-07-15 13:51:11.332495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.332539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.333560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.333593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.333622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.333814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.333825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.335623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.335655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.336372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.336404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.336621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.337580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.337613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.337641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.338621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.338844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.338855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.340512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.340552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.340816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.340851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.341180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.341220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.978 [2024-07-15 13:51:11.341254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.341521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.341553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.341878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.341889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.343799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.343851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.344124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.344161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.344441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.344489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.344750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.344783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.345047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.345323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.345334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.347209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.347474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.347735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.348006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.348279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.348327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.348600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.348862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.978 [2024-07-15 13:51:11.349131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.349469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.349481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.351413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.351678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.351945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.352218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.352590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.352860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.353129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.353391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.353665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.353950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.353961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.356177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.356447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.356712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.356975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.357355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.357634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.357909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.358196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.358459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.358754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.978 [2024-07-15 13:51:11.358765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.360660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.360929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.361196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.361456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.361728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.362004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.362267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.362524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.362783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.363093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.363104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.978 [2024-07-15 13:51:11.365026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.365293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.365563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.365826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.366170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.366440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.366699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.366958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.367227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.367486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.367497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.369478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.979 [2024-07-15 13:51:11.369751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.370015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.370274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.370606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.370875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.371141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.371407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.371671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.371938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.371949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.373863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.374130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.374390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.374651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.374927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.375206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.375471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.375731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.375993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.376282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.376297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.378185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.378451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.378718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.979 [2024-07-15 13:51:11.378992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.379338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.379610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.379872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.380135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.380402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.380747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.380760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.382742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.383017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.383281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.383543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.383862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.384141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.384407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.384665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.384922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.385269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.385282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.387212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.387482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.387744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.388014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.388280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.979 [2024-07-15 13:51:11.388551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.388810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.389075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.389341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.389628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.389639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.391591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.392531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.392956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.393719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.393997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.394268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.394528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.394787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.395051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.395292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.395303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.397204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.397469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.397733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.397992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.398304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.398572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.398833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.979 [2024-07-15 13:51:11.399103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.399378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.399705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.399717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.401677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.401943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.402214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.402476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.979 [2024-07-15 13:51:11.402782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.403065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.403343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.403603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.403866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.404209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.404222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.405724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.406628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.407615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.408599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.408787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.409066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.409329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.409590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.980 [2024-07-15 13:51:11.409857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:23.980 [2024-07-15 13:51:11.410094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:23.980 [2024-07-15 13:51:11.410106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeated for every entry between 13:51:11.410 and 13:51:11.586 ...]
00:28:23.986 [2024-07-15 13:51:11.586440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:23.986 [2024-07-15 13:51:11.586473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:23.986 [2024-07-15 13:51:11.586741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.586779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.587121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.587134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.588908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.588941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.589219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.589252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.589582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.589639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.589914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.589951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.590001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.590298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:23.986 [2024-07-15 13:51:11.590310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.592241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.592276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.592549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.592592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.592962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.593243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.593278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.593308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.593585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.253 [2024-07-15 13:51:11.593895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.593907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.595833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.595873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.596822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.597087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.597099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.599053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.599093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.599355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.599389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.599708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.599748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.600014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.600050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.600306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.600569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.600581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.253 [2024-07-15 13:51:11.602491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.602759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.603029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.603292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.603614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.603658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.603919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.604187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.604460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.604733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.604744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.606907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.607189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.607476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.253 [2024-07-15 13:51:11.607749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.608083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.608365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.608647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.608914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.609191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.609513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.609525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.611413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.611679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.254 [2024-07-15 13:51:11.611952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.612220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.612483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.612756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.613038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.613303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.613569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.613887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.613899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.615606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.616710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.616980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.617248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.617505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.617779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.618854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.619131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.619396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.619578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.619590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.622296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.622580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.622844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.623965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.254 [2024-07-15 13:51:11.624300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.624575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.624837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.625113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.626215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.626560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.626572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.628420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.628693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.629761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.630036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.630354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.631286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.631558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.631817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.632091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.632365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.632378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.634112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.634377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.634656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.634922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.635113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.635425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.254 [2024-07-15 13:51:11.635686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.636758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.637033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.637366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.637377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.639110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.640215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.640490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.640752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.641053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.641342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.642466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.642733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.643007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.643189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.643200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.646156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.646437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.646729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.647644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.647979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.648259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.648525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.648787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.254 [2024-07-15 13:51:11.649729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.650071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.650084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.652567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.652916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.653942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.654432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.654620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.654899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.655311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.656093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.656357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.656668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.656680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.658658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.659379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.254 [2024-07-15 13:51:11.659642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.659905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.660172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.660685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.661377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.661640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.662362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.662577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.255 [2024-07-15 13:51:11.662589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.664932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.665203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.665869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.666404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.666720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.666992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.667264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.667528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.667789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.668038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.668050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.669951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.670223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.670486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.670747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.671010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.671285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.671546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.671806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.672073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.672257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.672268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.674495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.255 [2024-07-15 13:51:11.675587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.676643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.677615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.677870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.678146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.678409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.678669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.679368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.679607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.679618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.681574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.682576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.683565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.683907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.684256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.684526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.684794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.685075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.685986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.686196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.686208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.688378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.689361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.690116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.255 [2024-07-15 13:51:11.690382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.690699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.690972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.691239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.692168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.693011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.693197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.693209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.695370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.696398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.696668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.696930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.697216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.697492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.697988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.698807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.699800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.700006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.700020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.702237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.702775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.703050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.703312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.703659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.255 [2024-07-15 13:51:11.703931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.705045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.706101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.255 [2024-07-15 13:51:11.707217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.707406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.707420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.709556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.709823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.710092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.710355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.710679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.711400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.712226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.713211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.714185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.714421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.714433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.715860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.716134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.716397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.716659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.716942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.717841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.718821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.256 [2024-07-15 13:51:11.719797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.720663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.720904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.720917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.722314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.722583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.722844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.723119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.723304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.724141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.725132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.726114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.726579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.726776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.726789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.728256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.728523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.728786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.729620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.729842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.730853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.731848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.732449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.733535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.256 [2024-07-15 13:51:11.733719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.733731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.735376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.735646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.736073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.736897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.737086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.738189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.739183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.739894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.740705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.740889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.740900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.742649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.742919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.743976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.744954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.745146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.746155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.746571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.747463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.748446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.748629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.256 [2024-07-15 13:51:11.748641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.256 [2024-07-15 13:51:11.750392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.256 [ ... identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" messages emitted continuously between 13:51:11.750392 and 13:51:12.005925 omitted for brevity ... ] 
00:28:24.525 [2024-07-15 13:51:12.005925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.525 [2024-07-15 13:51:12.006904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.008025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.009099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.009290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.009301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.009311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.009320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.011443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.012282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.013294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.014310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.015126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.015965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.016982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.017993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.018218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.018229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.018240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.018250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.020933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.021766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.022777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.023755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.025057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.525 [2024-07-15 13:51:12.025934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.026949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.027960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.028342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.028354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.028364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.028374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.031300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.032405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.033554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.034690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.035954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.037066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.038144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.039188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.039510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.039522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.039536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.039546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.042019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.043042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.044056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.044909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.045988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.525 [2024-07-15 13:51:12.047029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.048040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.048711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.049025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.049036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.049047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.049056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.051452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.052478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.053489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.053945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.054969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.055942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.056970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.057251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.057587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.057600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.057611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.057621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.060155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.061284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.062449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.063134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.064383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.525 [2024-07-15 13:51:12.065397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.066293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.066561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.066895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.066907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.066918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.066928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.069370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.070158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.071038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.071865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.073073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.073823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.074097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.074364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.074706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.074717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.074728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.074738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.077023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.077862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.078280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.078550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.525 [2024-07-15 13:51:12.079158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.525 [2024-07-15 13:51:12.079428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.080453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.081624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.081830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.081841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.081851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.081865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.083411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.084299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.084333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.084601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.085770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.086058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.086092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.086839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.087072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.087083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.087093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.087103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.089233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.089935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.090207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.090475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.091104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.526 [2024-07-15 13:51:12.092211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.093173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.094115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.094476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.094488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.094498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.094510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.096116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.096401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.096671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.097655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.098318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.099181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.100129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.101016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.101312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.101324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.101334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.101345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.103915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.104808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.105683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.106462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.106988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.526 [2024-07-15 13:51:12.107265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.107527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.107794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.108072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.108265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.108276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.108286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.108295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.108305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.110253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.110545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.110814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.111087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.111358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.111704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.111748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.112662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.526 [2024-07-15 13:51:12.114372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.114663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.114699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.114965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.115287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.115328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.115598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.115628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.115895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.116175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.116187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.116197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.116207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.116217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.117889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.118165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.118211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.118486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.118749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.118798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.526 [2024-07-15 13:51:12.119725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.119762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.526 [2024-07-15 13:51:12.121498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.121763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.121794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.122057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.122351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.122401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.122669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.122703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.122963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.123311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.123324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.123335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.123345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.123356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.125028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.125326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.125356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.125622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.125939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.125979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.527 [2024-07-15 13:51:12.126252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.126888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.128970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.129267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.129303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.129578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.129915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.129958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.130884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.132591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.132880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.527 [2024-07-15 13:51:12.132912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.133184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.133440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.133489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.133755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.133785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.134056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.134409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.134420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.134430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.134441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.134451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.136084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.136357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.136399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.136670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.137012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.137056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.137331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.137366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.137639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.137987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.138004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.138016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.527 [2024-07-15 13:51:12.138026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.527 [2024-07-15 13:51:12.138037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.139749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.140948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.141223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.141499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.141510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.141520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.141530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.141541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.143277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.143567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.143603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.143871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.144205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.144255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.144523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.789 [2024-07-15 13:51:12.144565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.790 [2024-07-15 13:51:12.144832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.145171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.145186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.145197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.145206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.145216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.147050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.147337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.147368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.147639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.147916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.147967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.148874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.150550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.150835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.150866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:24.790 [2024-07-15 13:51:12.151136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:24.790 [2024-07-15 13:51:12.151459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* entries from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources ("Failed to get src_mbufs!") repeat continuously between 13:51:12.151459 and 13:51:12.415504; duplicate log lines omitted ...]
00:28:25.059 [2024-07-15 13:51:12.415504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:25.059 [2024-07-15 13:51:12.415771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.416769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.417110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.417123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.418772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.419753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.420042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.420318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.420371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.420645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.421005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.059 [2024-07-15 13:51:12.421019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.423527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.423798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.423832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.424966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.425234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.425556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.425570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.427639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.427675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.427939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.428215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.428581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.428592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.428602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.428612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.428656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.059 [2024-07-15 13:51:12.428916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.429180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.429215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.429551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.429565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.432588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.432857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.433124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.433156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.433439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.433450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.433460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.433469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.434240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.435256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.435288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.436086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.436301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.436312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.438153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.438706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.438740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.439261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.439477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.059 [2024-07-15 13:51:12.439488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.439498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.439507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.439988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.440030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.440289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.440565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.440906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.440922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.443934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.443974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.444240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.444923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.445123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.445135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.445145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.445154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.445200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.445763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.059 [2024-07-15 13:51:12.446829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.446859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.447048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.447059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.448611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.060 [2024-07-15 13:51:12.449523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.449787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.449820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.450123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.450134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.450144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.450153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.451071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.452001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.452037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.452654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.452892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.452904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.455753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.456205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.456245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.457044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.457229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.457240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.457249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.457259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.458366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.458400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.459097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.060 [2024-07-15 13:51:12.459608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.459933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.459947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.462136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.462173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.463132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.464098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.464346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.464357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.464367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.464376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.464422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.465317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.466329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.466360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.466545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.466555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.469411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.470222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.471190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.471222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.471406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.471420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.471430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.060 [2024-07-15 13:51:12.471439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.471886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.472788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.472823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.473785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.473970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.473980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.475883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.476989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.477031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.477620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.477653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.478145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.478376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.478386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.480899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.480936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.480986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.060 [2024-07-15 13:51:12.481018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.481199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.481210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.481220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.481232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.481275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.482240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.482272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.482300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.482523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.482534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.483533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.060 [2024-07-15 13:51:12.483571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.483979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.484011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.484039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.484067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.484361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.061 [2024-07-15 13:51:12.484373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.486756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.487791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.487821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.487851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.487879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.061 [2024-07-15 13:51:12.488260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.488635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.490954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.490989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.491710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.492816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.492847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.492873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.492902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.061 [2024-07-15 13:51:12.493244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.493679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.496982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.497029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.497070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.497100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.497128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.497311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.497323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.498460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.061 [2024-07-15 13:51:12.498494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.498524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.498784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.061 [2024-07-15 13:51:12.499661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.499672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.502949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.503917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.503950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.503984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.504233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.504245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.504255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.504264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.504304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.504967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.505001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.062 [2024-07-15 13:51:12.505260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.505467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.505477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.507840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.507876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.507905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.508866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.509145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.509157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.509166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.509176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.509221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.510164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.510198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.511159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.511346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.511357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.514309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.514343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.515135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.515168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.515354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.515365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.062 [2024-07-15 13:51:12.515374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.062 [2024-07-15 13:51:12.515384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:25.062 [2024-07-15 13:51:12.515424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(the identical "accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" entry repeats continuously, one console line per occurrence, with timestamps running from 13:51:12.515 through 13:51:12.707)
00:28:25.333 [2024-07-15 13:51:12.707356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:25.333 [2024-07-15 13:51:12.707367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.333 [2024-07-15 13:51:12.707405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.333 [2024-07-15 13:51:12.707674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.333 [2024-07-15 13:51:12.707946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.333 [2024-07-15 13:51:12.707981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.333 [2024-07-15 13:51:12.708193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.333 [2024-07-15 13:51:12.708204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.710809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.711746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.712015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.712277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.712315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.712577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.712856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.712867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.717264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.717537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.717574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.334 [2024-07-15 13:51:12.717835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.718991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.719220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.719231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.723100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.723147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.723410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.723671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.723992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.724815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.334 [2024-07-15 13:51:12.724826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.727898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.728834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.729111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.729551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.729590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.730194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.730441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.730452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.733516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.734478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.734513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.735330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.735590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.735601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.735611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.735620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.736700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.334 [2024-07-15 13:51:12.736740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.737005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.737331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.737516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.334 [2024-07-15 13:51:12.737527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.740430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.740472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.740735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.741779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.742311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.742345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.742531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.742545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.744581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.745618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.745892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.745926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.746254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.335 [2024-07-15 13:51:12.746266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.746276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.746296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.747295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.748271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.748307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.748907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.749126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.749139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.750948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.751747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.751783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.752587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.752772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.752783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.752793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.752802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.753786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.753820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.754248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.755119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.755307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.755318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.757798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.335 [2024-07-15 13:51:12.757840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.758104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.758862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.759079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.759090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.759100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.759109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.759152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.760116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.761080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.761112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.761378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.761390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.763539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.763808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.764859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.765609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.765642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.335 [2024-07-15 13:51:12.766438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.766683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.766695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.335 [2024-07-15 13:51:12.769479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.769813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.769847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.770842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.771653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.772036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.772372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.772383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.775046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.775092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.776054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.776949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.777176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.777187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.777197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.336 [2024-07-15 13:51:12.777206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.777246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.778189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.779218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.779251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.779439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.779451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.781404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.781959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.782798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.782832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.783025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.783037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.783051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.783061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.784091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.784819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.784851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.786013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.786211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.786223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.789248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.789527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.789562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.336 [2024-07-15 13:51:12.790097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.790316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.790329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.790339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.790349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.791380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.791414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.792419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.793182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.793378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.793390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.795922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.795973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.797751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.798612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.799613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.799648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.799840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.336 [2024-07-15 13:51:12.799852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.801843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.802601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.336 [2024-07-15 13:51:12.802872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.802908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.803242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.803255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.803264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.803274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.804422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.804707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.804740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.805267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.805485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.805497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.808802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.809835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.809870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.809898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.337 [2024-07-15 13:51:12.810451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.810836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.811034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.811045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.813799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.814661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.814694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.814723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.814930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.814941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.816907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.816945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.816972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.337 [2024-07-15 13:51:12.817204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.817682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.819704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.819739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.819777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.819806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.819986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.820343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.821823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.337 [2024-07-15 13:51:12.821857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.821887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.337 [2024-07-15 13:51:12.821916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.822710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.824810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.824845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.824872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.824903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.338 [2024-07-15 13:51:12.825247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.338 [2024-07-15 13:51:12.825274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:25.338 accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! -- same message repeated continuously from 13:51:12.825274 through 13:51:12.970992
00:28:25.605 [2024-07-15 13:51:12.970992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:25.605 [2024-07-15 13:51:12.971034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.971386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.971398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.972588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.972863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.973846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.974120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.974157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.974500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.974511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.975937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.976210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.976247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.976510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.976796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.976808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.976818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.605 [2024-07-15 13:51:12.976828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.977109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.977151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.977417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.977679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.977939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.977950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.979188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.979231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.979492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.979755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.980095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.980107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.980117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.980129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.980177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.980441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.981452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.981491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.981676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.981687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.982563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.982831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.983103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.605 [2024-07-15 13:51:12.983139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.983358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.983370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.983379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.983389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.984081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.984810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.984843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.985902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.986127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.986139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.987514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.988623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.988659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.989877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.990182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.990371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:25.605 [2024-07-15 13:51:12.990382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.991569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.991606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.991868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.992905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.993100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.993112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.993121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.993131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.605 [2024-07-15 13:51:12.993178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.993670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.994378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.994413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.994596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.994609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.995715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.996830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.997893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.997934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.998269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.998281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.998290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.998300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:25.606 [2024-07-15 13:51:12.999299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
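Note on the repeated allocation errors above: accel_dpdk_cryptodev_task_alloc_resources() in the DPDK cryptodev accel module logs "Failed to get src_mbufs!" (and, just below, "Failed to get dst_mbufs!") whenever it cannot allocate the mbufs needed for an in-flight crypto task, typically a sign that the module's buffer pool is momentarily exhausted under load. With bdevperf driving a 128-deep verify workload at 64 KiB per I/O against four crypto bdevs, that happens many times in a burst, but the misses are transient: the verification summary that follows reports 0.00 Fail/s for every crypto_ram* job, so no I/O is actually lost. A quick way to tally the misses, assuming this console output was captured to a file such as bdevperf.log (hypothetical name):

# count transient source/destination mbuf allocation misses in the captured log
grep -c 'Failed to get src_mbufs' bdevperf.log
grep -c 'Failed to get dst_mbufs' bdevperf.log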
00:28:25.606 [2024-07-15 13:51:13.004197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:28:25.606 [2024-07-15 13:51:13.010734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:28:25.606 [2024-07-15 13:51:13.017225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:28:25.606 [2024-07-15 13:51:13.018639] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:28:25.606 [2024-07-15 13:51:13.020303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:28:25.864 00:28:25.864 Latency(us) 00:28:25.864 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:25.864 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x0 length 0x100 00:28:25.864 crypto_ram : 5.55 60.14 3.76 0.00 0.00 2030102.92 48781.58 1677721.60 00:28:25.864 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x100 length 0x100 00:28:25.864 crypto_ram : 5.47 55.62 3.48 0.00 0.00 2193319.24 13563.10 1772549.34 00:28:25.864 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x0 length 0x100 00:28:25.864 crypto_ram1 : 5.58 65.06 4.07 0.00 0.00 1867935.03 47869.77 1546421.65 00:28:25.864 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x100 length 0x100 00:28:25.864 crypto_ram1 : 5.49 59.59 3.72 0.00 0.00 2019806.62 20173.69 1641249.39 00:28:25.864 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x0 length 0x100 00:28:25.864 crypto_ram2 : 5.40 420.10 26.26 0.00 0.00 282549.24 3262.55 434019.28 00:28:25.864 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x100 length 0x100 00:28:25.864 crypto_ram2 : 5.37 408.85 25.55 0.00 0.00 290428.37 9118.05 448608.17 00:28:25.864 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x0 length 0x100 00:28:25.864 crypto_ram3 : 5.46 432.65 27.04 0.00 0.00 269173.14 17780.20 335544.32 00:28:25.864 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:25.864 Verification LBA range: start 0x100 length 0x100 00:28:25.864 crypto_ram3 : 5.43 427.10 26.69 0.00 0.00 272903.61 4331.07 291777.67 00:28:25.864 =================================================================================================================== 00:28:25.864 Total : 1929.10 120.57 0.00 0.00 499373.98 3262.55 1772549.34 00:28:26.121 00:28:26.121 real 0m8.652s 00:28:26.121 user 0m16.489s 00:28:26.121 sys 0m0.422s 00:28:26.121 13:51:13 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:26.121 13:51:13 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:28:26.121 ************************************ 00:28:26.121 END TEST bdev_verify_big_io 00:28:26.121 ************************************ 00:28:26.379 13:51:13 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:28:26.379 13:51:13 blockdev_crypto_qat -- 
bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:26.379 13:51:13 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:28:26.379 13:51:13 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:26.379 13:51:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:26.379 ************************************ 00:28:26.379 START TEST bdev_write_zeroes 00:28:26.379 ************************************ 00:28:26.379 13:51:13 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:26.379 [2024-07-15 13:51:13.841457] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:26.380 [2024-07-15 13:51:13.841502] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid143852 ] 00:28:26.380 [2024-07-15 13:51:13.925890] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:26.638 [2024-07-15 13:51:14.010898] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:26.638 [2024-07-15 13:51:14.031826] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:26.638 [2024-07-15 13:51:14.039864] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:26.638 [2024-07-15 13:51:14.047876] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:26.638 [2024-07-15 13:51:14.145104] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:29.177 [2024-07-15 13:51:16.327493] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:29.177 [2024-07-15 13:51:16.327544] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:29.177 [2024-07-15 13:51:16.327555] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.177 [2024-07-15 13:51:16.335500] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:29.177 [2024-07-15 13:51:16.335513] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:29.177 [2024-07-15 13:51:16.335521] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.177 [2024-07-15 13:51:16.343520] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:29.177 [2024-07-15 13:51:16.343532] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:29.177 [2024-07-15 13:51:16.343540] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.177 [2024-07-15 13:51:16.351541] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:29.177 [2024-07-15 13:51:16.351553] bdev.c:8157:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc3 00:28:29.177 [2024-07-15 13:51:16.351560] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.177 Running I/O for 1 seconds... 00:28:30.147 00:28:30.147 Latency(us) 00:28:30.147 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:30.147 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:30.147 crypto_ram : 1.02 3092.63 12.08 0.00 0.00 41193.56 3732.70 50377.24 00:28:30.147 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:30.147 crypto_ram1 : 1.02 3098.18 12.10 0.00 0.00 40963.08 3732.70 46502.07 00:28:30.147 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:30.147 crypto_ram2 : 1.01 24132.83 94.27 0.00 0.00 5250.45 1624.15 7094.98 00:28:30.147 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:30.147 crypto_ram3 : 1.01 24165.44 94.40 0.00 0.00 5231.89 1624.15 5784.26 00:28:30.147 =================================================================================================================== 00:28:30.147 Total : 54489.07 212.85 0.00 0.00 9324.51 1624.15 50377.24 00:28:30.405 00:28:30.405 real 0m4.046s 00:28:30.405 user 0m3.685s 00:28:30.405 sys 0m0.325s 00:28:30.405 13:51:17 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:30.405 13:51:17 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:28:30.405 ************************************ 00:28:30.405 END TEST bdev_write_zeroes 00:28:30.405 ************************************ 00:28:30.405 13:51:17 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:28:30.405 13:51:17 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:30.405 13:51:17 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:28:30.405 13:51:17 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:30.405 13:51:17 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:30.405 ************************************ 00:28:30.405 START TEST bdev_json_nonenclosed 00:28:30.405 ************************************ 00:28:30.405 13:51:17 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:30.405 [2024-07-15 13:51:17.960305] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
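The bdev_json_nonenclosed run that starts here, and the bdev_json_nonarray run after it, are negative tests: each hands bdevperf a deliberately malformed --json configuration and passes only if the application refuses to start. The fixture files themselves (nonenclosed.json, nonarray.json) are not reproduced in this log, but the two error messages below pin down what json_config_prepare_ctx() requires: a top-level JSON object whose "subsystems" member is an array. For contrast, an illustrative valid shape (not the contents of either fixture) looks like this:

cat <<'EOF' > valid_config.json
{
  "subsystems": [
    { "subsystem": "bdev", "config": [] }
  ]
}
EOF
# A config that is not enclosed in {} at the top level, or whose "subsystems"
# member is not an array, makes bdevperf exit before running any I/O.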
00:28:30.405 [2024-07-15 13:51:17.960351] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144400 ] 00:28:30.664 [2024-07-15 13:51:18.043965] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:30.664 [2024-07-15 13:51:18.124921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:30.664 [2024-07-15 13:51:18.124978] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:28:30.664 [2024-07-15 13:51:18.124992] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:30.664 [2024-07-15 13:51:18.125008] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:30.664 00:28:30.664 real 0m0.297s 00:28:30.664 user 0m0.182s 00:28:30.664 sys 0m0.113s 00:28:30.664 13:51:18 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:28:30.664 13:51:18 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:30.664 13:51:18 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:30.664 ************************************ 00:28:30.664 END TEST bdev_json_nonenclosed 00:28:30.664 ************************************ 00:28:30.664 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:28:30.664 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:28:30.664 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:30.664 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:28:30.664 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:30.664 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:30.664 ************************************ 00:28:30.664 START TEST bdev_json_nonarray 00:28:30.664 ************************************ 00:28:30.664 13:51:18 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:30.923 [2024-07-15 13:51:18.321701] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:30.923 [2024-07-15 13:51:18.321745] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144426 ] 00:28:30.923 [2024-07-15 13:51:18.403282] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:30.923 [2024-07-15 13:51:18.485204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:30.923 [2024-07-15 13:51:18.485268] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
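The es=234 / "return 234" / "true" sequence traced above is the expected-failure bookkeeping: a helper in autotest_common.sh records bdevperf's non-zero exit status in es, run_test propagates it, and the bare `true` traced from blockdev.sh line 782 (consistent with an `|| true` guard there) keeps the expected failure from aborting the suite. In isolation the idiom looks roughly like the sketch below; this is illustrative, not the actual autotest_common.sh implementation, though the bdevperf path and arguments match the run traced above:

if /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1; then
  echo "ERROR: bdevperf accepted a malformed config" >&2
  exit 1
else
  es=$?    # 234 in the run traced above
  echo "expected startup failure, exit status $es"
fi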
00:28:30.923 [2024-07-15 13:51:18.485284] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:30.923 [2024-07-15 13:51:18.485292] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:31.182 00:28:31.182 real 0m0.297s 00:28:31.182 user 0m0.180s 00:28:31.182 sys 0m0.114s 00:28:31.182 13:51:18 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:28:31.182 13:51:18 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:31.182 13:51:18 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:31.182 ************************************ 00:28:31.182 END TEST bdev_json_nonarray 00:28:31.182 ************************************ 00:28:31.182 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:28:31.182 13:51:18 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:28:31.182 00:28:31.182 real 1m8.351s 00:28:31.182 user 2m34.038s 00:28:31.182 sys 0m7.765s 00:28:31.182 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:31.182 13:51:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:31.182 ************************************ 00:28:31.182 END TEST blockdev_crypto_qat 00:28:31.182 ************************************ 00:28:31.182 13:51:18 -- common/autotest_common.sh@1142 -- # return 0 00:28:31.182 13:51:18 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:28:31.182 13:51:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:31.182 13:51:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:31.182 13:51:18 -- common/autotest_common.sh@10 -- # set +x 00:28:31.182 ************************************ 00:28:31.182 START TEST chaining 00:28:31.182 ************************************ 00:28:31.182 13:51:18 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:28:31.182 * Looking for test storage... 
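The chaining suite that begins here exercises crypto operation chaining over NVMe/TCP. As the traced setup below shows, nvmf/common.sh finds no supported NICs ("phy-fallback") and builds a veth-based test network instead: a namespace nvmf_tgt_ns_spdk holding the target interfaces (10.0.0.2 and 10.0.0.3), an initiator-side nvmf_init_if (10.0.0.1), and an nvmf_br bridge tying them together. The target then listens on 10.0.0.2:4420 as nqn.2016-06.io.spdk:cnode0, backed by a malloc bdev wrapped in two crypto vbdevs (crypto0/crypto1 using key0/key1). A stripped-down sketch of the same topology, assuming root privileges and ignoring the second target interface and all cleanup, mirrors the nvmf_veth_init trace below:

ip netns add nvmf_tgt_ns_spdk                               # target lives in its own namespace
ip link add nvmf_init_if type veth peer name nvmf_init_br   # initiator-side veth pair
ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br     # target-side veth pair
ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk
ip addr add 10.0.0.1/24 dev nvmf_init_if
ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if
ip link set nvmf_init_if up && ip link set nvmf_init_br up && ip link set nvmf_tgt_br up
ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
ip link add nvmf_br type bridge && ip link set nvmf_br up
ip link set nvmf_init_br master nvmf_br
ip link set nvmf_tgt_br master nvmf_br
iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1           # connectivity check, as in the log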
00:28:31.182 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:31.182 13:51:18 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@7 -- # uname -s 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00d40ca9-2a78-e711-906e-0017a4403562 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=00d40ca9-2a78-e711-906e-0017a4403562 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:28:31.182 13:51:18 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:31.182 13:51:18 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:31.182 13:51:18 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:31.182 13:51:18 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:31.182 13:51:18 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:31.182 13:51:18 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:31.182 13:51:18 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:31.182 13:51:18 chaining -- paths/export.sh@5 -- # 
export PATH 00:28:31.182 13:51:18 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@47 -- # : 0 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:28:31.183 13:51:18 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:28:31.183 13:51:18 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:28:31.183 13:51:18 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:28:31.183 13:51:18 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:28:31.183 13:51:18 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:28:31.183 13:51:18 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:31.183 13:51:18 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:31.183 13:51:18 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:31.183 13:51:18 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:31.183 13:51:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:39.350 
13:51:25 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@336 -- # return 1 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:28:39.350 WARNING: No supported devices were found, fallback requested for tcp test 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:28:39.350 13:51:25 
chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:28:39.350 Cannot find device "nvmf_tgt_br" 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@155 -- # true 00:28:39.350 13:51:25 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:28:39.350 Cannot find device "nvmf_tgt_br2" 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@156 -- # true 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:28:39.350 Cannot find device "nvmf_tgt_br" 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@158 -- # true 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:28:39.350 Cannot find device "nvmf_tgt_br2" 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@159 -- # true 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:28:39.350 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@162 -- # true 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:28:39.350 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@163 -- # true 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:28:39.350 
13:51:26 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:28:39.350 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:39.350 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.070 ms 00:28:39.350 00:28:39.350 --- 10.0.0.2 ping statistics --- 00:28:39.350 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:39.350 rtt min/avg/max/mdev = 0.070/0.070/0.070/0.000 ms 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:28:39.350 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:28:39.350 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.039 ms 00:28:39.350 00:28:39.350 --- 10.0.0.3 ping statistics --- 00:28:39.350 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:39.350 rtt min/avg/max/mdev = 0.039/0.039/0.039/0.000 ms 00:28:39.350 13:51:26 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:28:39.350 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:28:39.351 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.045 ms 00:28:39.351 00:28:39.351 --- 10.0.0.1 ping statistics --- 00:28:39.351 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:39.351 rtt min/avg/max/mdev = 0.045/0.045/0.045/0.000 ms 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@433 -- # return 0 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:39.351 13:51:26 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@481 -- # nvmfpid=148095 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:39.351 13:51:26 chaining -- nvmf/common.sh@482 -- # waitforlisten 148095 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@829 -- # '[' -z 148095 ']' 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:39.351 13:51:26 chaining -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:39.351 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:39.351 13:51:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.351 [2024-07-15 13:51:26.463250] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:39.351 [2024-07-15 13:51:26.463303] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:39.351 [2024-07-15 13:51:26.553172] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:39.351 [2024-07-15 13:51:26.636306] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:39.351 [2024-07-15 13:51:26.636348] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:28:39.351 [2024-07-15 13:51:26.636357] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:39.351 [2024-07-15 13:51:26.636366] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:28:39.351 [2024-07-15 13:51:26.636372] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:28:39.351 [2024-07-15 13:51:26.636393] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:39.917 13:51:27 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.917 13:51:27 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@69 -- # mktemp 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.RmbCkTJSoI 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@69 -- # mktemp 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.c3QOcXNJfe 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.917 malloc0 00:28:39.917 true 00:28:39.917 true 00:28:39.917 [2024-07-15 13:51:27.348328] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:39.917 crypto0 00:28:39.917 [2024-07-15 13:51:27.356357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:39.917 crypto1 00:28:39.917 [2024-07-15 13:51:27.364445] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:39.917 [2024-07-15 13:51:27.380665] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:39.917 13:51:27 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@85 -- # update_stats 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:39.917 13:51:27 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:39.917 13:51:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:40.176 13:51:27 chaining -- 
bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:40.176 13:51:27 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.176 13:51:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:40.176 13:51:27 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.RmbCkTJSoI bs=1K count=64 00:28:40.176 64+0 records in 00:28:40.176 64+0 records out 00:28:40.176 65536 bytes (66 kB, 64 KiB) copied, 0.00106259 s, 61.7 MB/s 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.RmbCkTJSoI --ob Nvme0n1 --bs 65536 --count 1 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@25 -- # local config 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:40.176 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:40.176 "subsystems": [ 00:28:40.176 { 00:28:40.176 "subsystem": "bdev", 00:28:40.176 "config": [ 00:28:40.176 { 00:28:40.176 "method": "bdev_nvme_attach_controller", 00:28:40.176 "params": { 00:28:40.176 "trtype": "tcp", 00:28:40.176 "adrfam": "IPv4", 00:28:40.176 "name": "Nvme0", 00:28:40.176 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:40.176 "traddr": "10.0.0.2", 00:28:40.176 "trsvcid": "4420" 00:28:40.176 } 00:28:40.176 }, 00:28:40.176 { 00:28:40.176 "method": "bdev_set_options", 00:28:40.176 "params": { 00:28:40.176 "bdev_auto_examine": false 00:28:40.176 } 00:28:40.176 } 00:28:40.176 ] 00:28:40.176 } 00:28:40.176 ] 00:28:40.176 }' 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:40.176 "subsystems": [ 00:28:40.176 { 00:28:40.176 "subsystem": "bdev", 00:28:40.176 "config": [ 00:28:40.176 { 00:28:40.176 "method": "bdev_nvme_attach_controller", 00:28:40.176 "params": { 00:28:40.176 "trtype": "tcp", 00:28:40.176 "adrfam": "IPv4", 00:28:40.176 "name": "Nvme0", 00:28:40.176 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:40.176 "traddr": "10.0.0.2", 00:28:40.176 "trsvcid": "4420" 00:28:40.176 } 00:28:40.176 }, 00:28:40.176 { 00:28:40.176 "method": "bdev_set_options", 00:28:40.176 "params": { 00:28:40.176 "bdev_auto_examine": false 00:28:40.176 } 00:28:40.176 } 00:28:40.176 ] 00:28:40.176 } 00:28:40.176 ] 00:28:40.176 }' 00:28:40.176 13:51:27 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.RmbCkTJSoI --ob Nvme0n1 --bs 65536 --count 1 00:28:40.176 [2024-07-15 13:51:27.678697] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
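What the trace above is doing, in short: chaining.sh builds a one-off JSON bdev config (gen_nvme.sh describes the remote NVMe/TCP controller, jq appends a bdev_set_options entry that turns off auto-examine) and hands it to spdk_dd, which writes the 64 KiB random file into Nvme0n1 on the target. A minimal stand-alone sketch of that idiom follows; the SPDK_DIR location and the input file name are placeholders, while the gen_nvme.sh flags, the jq filter and the spdk_dd options are the ones shown in the trace.

#!/usr/bin/env bash
# Sketch of the config-generation idiom traced above. Assumes an SPDK checkout
# at $SPDK_DIR and an NVMe/TCP subsystem nqn.2016-06.io.spdk:cnode0 already
# listening on 10.0.0.2:4420; /tmp/input.bin is a placeholder input file.
set -e
SPDK_DIR=${SPDK_DIR:-/path/to/spdk}

# gen_nvme.sh emits {"subsystems":[{"subsystem":"bdev","config":[...]}]} with a
# bdev_nvme_attach_controller entry for the given --trid.
config=$("$SPDK_DIR/scripts/gen_nvme.sh" --mode=remote --json-with-subsystems \
         --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0)

# Append a bdev_set_options entry (same jq filter as chaining.sh) so the bdev
# modules do not auto-examine the attached namespace before the copy runs.
config=$(jq '.subsystems[0].config[.subsystems[0].config | length] |=
             {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' <<<"$config")

# Hand the config to spdk_dd on an inherited fd (the trace uses /dev/fd/62 via
# process substitution) and write one 64 KiB block from the input file to Nvme0n1.
"$SPDK_DIR/build/bin/spdk_dd" -c <(echo "$config") \
    --if /tmp/input.bin --ob Nvme0n1 --bs 65536 --count 1

Setting bdev_auto_examine to false keeps bdev modules such as lvol, RAID and GPT from examining, and possibly claiming, the freshly attached namespace before spdk_dd starts the transfer.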
00:28:40.176 [2024-07-15 13:51:27.678746] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148314 ] 00:28:40.176 [2024-07-15 13:51:27.762845] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:40.434 [2024-07-15 13:51:27.847827] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:40.691  Copying: 64/64 [kB] (average 15 MBps) 00:28:40.691 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:40.691 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.691 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.691 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:40.691 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.691 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.691 13:51:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
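Every stat assertion in this run is built from the same two jq queries visible above: either a top-level counter such as .sequence_executed, or the per-opcode .operations[] | select(.opcode == "...").executed value, both read from accel_get_stats. A stand-alone sketch of that helper, assuming scripts/rpc.py and the default /var/tmp/spdk.sock socket in place of the test's rpc_cmd wrapper:

#!/usr/bin/env bash
# Stand-alone sketch of the get_stat pattern used by chaining.sh. The rpc.py
# path is an assumption; the jq filters are the ones appearing in the trace.
SPDK_DIR=${SPDK_DIR:-/path/to/spdk}

get_stat() {
    # get_stat <event> [opcode]: without an opcode, read a top-level counter
    # such as sequence_executed; with one, read that opcode's counter.
    local event=$1 opcode=$2
    if [[ -z $opcode ]]; then
        "$SPDK_DIR/scripts/rpc.py" accel_get_stats | jq -r ".${event}"
    else
        "$SPDK_DIR/scripts/rpc.py" accel_get_stats \
            | jq -r ".operations[] | select(.opcode == \"$opcode\").$event"
    fi
}

# Snapshot the four counters the same way update_stats does.
declare -A stats
stats[sequence_executed]=$(get_stat sequence_executed)
stats[encrypt_executed]=$(get_stat executed encrypt)
stats[decrypt_executed]=$(get_stat executed decrypt)
stats[copy_executed]=$(get_stat executed copy)

chaining.sh keeps these snapshots in an associative array named stats and compares the live counters against it after each transfer, as the (( ... )) checks in the trace show.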
00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@96 -- # update_stats 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:40.949 13:51:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.949 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
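The bookkeeping cycle repeated through the rest of the run is: snapshot the counters (update_stats), push one spdk_dd job through the chain, then assert the deltas. The expected deltas follow from the stacked layout, crypto1 on top of crypto0: one dd job executes one accel sequence, and each I/O is encrypted (or, on read-back, decrypted) once per crypto layer, which is why the checks expect +1 on sequence_executed and +2 on the opcode counter per 64 KiB transfer, and later +16 and +32 for the 16 x 4 KiB pass. A small sketch of one such cycle, reusing the get_stat helper sketched above:

# Snapshot -> transfer -> assert, mirroring update_stats and the (( ... )) checks.
before_seq=$(get_stat sequence_executed)
before_enc=$(get_stat executed encrypt)
before_dec=$(get_stat executed decrypt)

# ... one 64 KiB spdk_dd write through crypto1 -> crypto0 -> Nvme0n1 (see the earlier sketch) ...

(( $(get_stat sequence_executed) == before_seq + 1 ))   # one dd job, one accel sequence
(( $(get_stat executed encrypt)  == before_enc + 2 ))   # encrypted once per crypto layer
(( $(get_stat executed decrypt)  == before_dec ))       # reads only happen in the next phase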
00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:40.950 13:51:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:40.950 13:51:28 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:41.208 13:51:28 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:41.208 13:51:28 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.c3QOcXNJfe --ib Nvme0n1 --bs 65536 --count 1 00:28:41.208 13:51:28 chaining -- bdev/chaining.sh@25 -- # local config 00:28:41.208 13:51:28 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:41.208 13:51:28 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:41.208 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:41.208 13:51:28 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:41.208 "subsystems": [ 00:28:41.208 { 00:28:41.208 "subsystem": "bdev", 00:28:41.208 "config": [ 00:28:41.208 { 00:28:41.208 "method": "bdev_nvme_attach_controller", 00:28:41.208 "params": { 00:28:41.208 "trtype": "tcp", 00:28:41.208 "adrfam": "IPv4", 00:28:41.208 "name": "Nvme0", 00:28:41.208 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:41.208 "traddr": "10.0.0.2", 00:28:41.208 "trsvcid": "4420" 00:28:41.208 } 00:28:41.208 }, 00:28:41.208 { 00:28:41.208 "method": "bdev_set_options", 00:28:41.209 "params": { 00:28:41.209 "bdev_auto_examine": false 00:28:41.209 } 00:28:41.209 } 00:28:41.209 ] 00:28:41.209 } 00:28:41.209 ] 00:28:41.209 }' 00:28:41.209 13:51:28 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.c3QOcXNJfe --ib Nvme0n1 --bs 65536 --count 1 00:28:41.209 13:51:28 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:41.209 "subsystems": [ 00:28:41.209 { 00:28:41.209 "subsystem": "bdev", 00:28:41.209 "config": [ 00:28:41.209 { 00:28:41.209 "method": "bdev_nvme_attach_controller", 00:28:41.209 "params": { 00:28:41.209 "trtype": "tcp", 00:28:41.209 "adrfam": "IPv4", 00:28:41.209 "name": "Nvme0", 00:28:41.209 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:41.209 "traddr": "10.0.0.2", 00:28:41.209 "trsvcid": "4420" 00:28:41.209 } 00:28:41.209 }, 00:28:41.209 { 00:28:41.209 "method": "bdev_set_options", 00:28:41.209 "params": { 
00:28:41.209 "bdev_auto_examine": false 00:28:41.209 } 00:28:41.209 } 00:28:41.209 ] 00:28:41.209 } 00:28:41.209 ] 00:28:41.209 }' 00:28:41.209 [2024-07-15 13:51:28.665681] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:41.209 [2024-07-15 13:51:28.665727] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148524 ] 00:28:41.209 [2024-07-15 13:51:28.750225] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:41.468 [2024-07-15 13:51:28.833167] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:41.726  Copying: 64/64 [kB] (average 12 MBps) 00:28:41.726 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:41.726 13:51:29 chaining -- bdev/chaining.sh@44 
-- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:41.726 13:51:29 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:41.984 13:51:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:41.984 13:51:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:41.984 13:51:29 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.RmbCkTJSoI /tmp/tmp.c3QOcXNJfe 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@25 -- # local config 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:41.984 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:41.984 "subsystems": [ 00:28:41.984 { 00:28:41.984 "subsystem": "bdev", 00:28:41.984 "config": [ 00:28:41.984 { 00:28:41.984 "method": "bdev_nvme_attach_controller", 00:28:41.984 "params": { 00:28:41.984 "trtype": "tcp", 00:28:41.984 "adrfam": "IPv4", 00:28:41.984 "name": "Nvme0", 00:28:41.984 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:41.984 "traddr": "10.0.0.2", 00:28:41.984 "trsvcid": "4420" 00:28:41.984 } 00:28:41.984 }, 00:28:41.984 { 00:28:41.984 "method": "bdev_set_options", 00:28:41.984 "params": { 00:28:41.984 "bdev_auto_examine": false 00:28:41.984 } 00:28:41.984 } 00:28:41.984 ] 00:28:41.984 } 00:28:41.984 ] 00:28:41.984 }' 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:28:41.984 13:51:29 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:41.984 "subsystems": [ 00:28:41.984 { 00:28:41.984 "subsystem": "bdev", 00:28:41.984 "config": [ 00:28:41.984 { 00:28:41.984 "method": "bdev_nvme_attach_controller", 00:28:41.984 "params": { 00:28:41.984 "trtype": "tcp", 00:28:41.984 "adrfam": "IPv4", 00:28:41.984 "name": "Nvme0", 00:28:41.984 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:41.984 "traddr": "10.0.0.2", 00:28:41.984 "trsvcid": "4420" 00:28:41.984 } 00:28:41.984 }, 00:28:41.984 { 00:28:41.984 "method": "bdev_set_options", 00:28:41.984 "params": { 00:28:41.984 "bdev_auto_examine": false 00:28:41.984 } 00:28:41.984 } 00:28:41.984 ] 00:28:41.984 } 
00:28:41.984 ] 00:28:41.984 }' 00:28:41.984 [2024-07-15 13:51:29.493113] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:41.984 [2024-07-15 13:51:29.493164] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148558 ] 00:28:41.984 [2024-07-15 13:51:29.577549] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:42.242 [2024-07-15 13:51:29.661961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:42.500  Copying: 64/64 [kB] (average 15 MBps) 00:28:42.500 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@106 -- # update_stats 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:42.500 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:42.500 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:42.500 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:42.500 13:51:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:42.500 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:42.500 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:42.757 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:42.757 13:51:30 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:42.758 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:42.758 13:51:30 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:28:42.758 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:42.758 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:42.758 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:42.758 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.RmbCkTJSoI --ob Nvme0n1 --bs 4096 --count 16 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@25 -- # local config 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:42.758 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:42.758 "subsystems": [ 00:28:42.758 { 00:28:42.758 "subsystem": "bdev", 00:28:42.758 "config": [ 00:28:42.758 { 00:28:42.758 "method": "bdev_nvme_attach_controller", 00:28:42.758 "params": { 00:28:42.758 "trtype": "tcp", 00:28:42.758 "adrfam": "IPv4", 00:28:42.758 "name": "Nvme0", 00:28:42.758 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:42.758 "traddr": "10.0.0.2", 00:28:42.758 "trsvcid": "4420" 00:28:42.758 } 00:28:42.758 }, 00:28:42.758 { 00:28:42.758 "method": "bdev_set_options", 00:28:42.758 "params": { 00:28:42.758 "bdev_auto_examine": false 00:28:42.758 } 00:28:42.758 } 00:28:42.758 ] 00:28:42.758 } 00:28:42.758 ] 00:28:42.758 }' 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.RmbCkTJSoI --ob Nvme0n1 --bs 4096 --count 16 00:28:42.758 13:51:30 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:42.758 "subsystems": [ 00:28:42.758 { 00:28:42.758 "subsystem": "bdev", 00:28:42.758 "config": [ 00:28:42.758 { 00:28:42.758 "method": "bdev_nvme_attach_controller", 00:28:42.758 "params": { 00:28:42.758 "trtype": "tcp", 00:28:42.758 "adrfam": "IPv4", 00:28:42.758 "name": "Nvme0", 00:28:42.758 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:42.758 "traddr": "10.0.0.2", 00:28:42.758 "trsvcid": "4420" 00:28:42.758 } 00:28:42.758 }, 00:28:42.758 { 00:28:42.758 "method": "bdev_set_options", 00:28:42.758 "params": { 00:28:42.758 "bdev_auto_examine": false 00:28:42.758 } 00:28:42.758 } 00:28:42.758 ] 00:28:42.758 } 00:28:42.758 ] 00:28:42.758 }' 00:28:42.758 [2024-07-15 13:51:30.316849] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 
initialization... 00:28:42.758 [2024-07-15 13:51:30.316903] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148754 ] 00:28:43.015 [2024-07-15 13:51:30.403286] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:43.015 [2024-07-15 13:51:30.484962] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:43.273  Copying: 64/64 [kB] (average 12 MBps) 00:28:43.273 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:43.273 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.273 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.273 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.273 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:43.530 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:43.530 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.531 13:51:30 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@114 -- # update_stats 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:43.531 13:51:31 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:43.531 13:51:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:43.531 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:43.789 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:43.789 13:51:31 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:43.789 13:51:31 chaining -- bdev/chaining.sh@117 -- # : 00:28:43.789 13:51:31 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.c3QOcXNJfe --ib Nvme0n1 --bs 4096 --count 16 00:28:43.789 13:51:31 chaining -- bdev/chaining.sh@25 -- # local config 00:28:43.790 13:51:31 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:43.790 13:51:31 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:43.790 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:43.790 13:51:31 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:43.790 "subsystems": [ 00:28:43.790 { 00:28:43.790 "subsystem": "bdev", 00:28:43.790 "config": [ 00:28:43.790 { 00:28:43.790 "method": "bdev_nvme_attach_controller", 00:28:43.790 "params": { 00:28:43.790 "trtype": "tcp", 00:28:43.790 "adrfam": "IPv4", 00:28:43.790 "name": "Nvme0", 00:28:43.790 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:43.790 "traddr": "10.0.0.2", 00:28:43.790 "trsvcid": "4420" 00:28:43.790 } 00:28:43.790 }, 00:28:43.790 { 00:28:43.790 "method": "bdev_set_options", 00:28:43.790 "params": { 00:28:43.790 "bdev_auto_examine": false 00:28:43.790 } 00:28:43.790 } 00:28:43.790 ] 00:28:43.790 } 00:28:43.790 ] 00:28:43.790 }' 00:28:43.790 13:51:31 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.c3QOcXNJfe --ib Nvme0n1 --bs 4096 --count 16 00:28:43.790 13:51:31 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:43.790 "subsystems": [ 00:28:43.790 { 00:28:43.790 "subsystem": "bdev", 00:28:43.790 "config": [ 00:28:43.790 { 00:28:43.790 "method": "bdev_nvme_attach_controller", 00:28:43.790 "params": { 00:28:43.790 "trtype": "tcp", 00:28:43.790 "adrfam": "IPv4", 00:28:43.790 "name": "Nvme0", 00:28:43.790 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:43.790 "traddr": "10.0.0.2", 00:28:43.790 "trsvcid": "4420" 
00:28:43.790 } 00:28:43.790 }, 00:28:43.790 { 00:28:43.790 "method": "bdev_set_options", 00:28:43.790 "params": { 00:28:43.790 "bdev_auto_examine": false 00:28:43.790 } 00:28:43.790 } 00:28:43.790 ] 00:28:43.790 } 00:28:43.790 ] 00:28:43.790 }' 00:28:43.790 [2024-07-15 13:51:31.264562] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:43.790 [2024-07-15 13:51:31.264615] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148909 ] 00:28:43.790 [2024-07-15 13:51:31.350734] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:44.048 [2024-07-15 13:51:31.435216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:44.306  Copying: 64/64 [kB] (average 1422 kBps) 00:28:44.306 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:44.306 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.306 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:44.306 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:44.306 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:44.564 13:51:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:44.564 13:51:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:44.564 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.564 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:44.564 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:44.565 13:51:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "decrypt").executed' 00:28:44.565 13:51:31 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.565 13:51:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:44.565 13:51:31 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.RmbCkTJSoI /tmp/tmp.c3QOcXNJfe 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.RmbCkTJSoI /tmp/tmp.c3QOcXNJfe 00:28:44.565 13:51:32 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@117 -- # sync 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@120 -- # set +e 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:28:44.565 rmmod nvme_tcp 00:28:44.565 rmmod nvme_fabrics 00:28:44.565 rmmod nvme_keyring 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@124 -- # set -e 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@125 -- # return 0 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@489 -- # '[' -n 148095 ']' 00:28:44.565 13:51:32 chaining -- nvmf/common.sh@490 -- # killprocess 148095 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@948 -- # '[' -z 148095 ']' 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@952 -- # kill -0 148095 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@953 -- # uname 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 148095 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 148095' 00:28:44.565 killing process with pid 148095 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@967 -- # kill 
148095 00:28:44.565 13:51:32 chaining -- common/autotest_common.sh@972 -- # wait 148095 00:28:44.823 13:51:32 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:28:44.823 13:51:32 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:28:44.823 13:51:32 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:28:44.823 13:51:32 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:28:44.823 13:51:32 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:28:44.823 13:51:32 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:44.823 13:51:32 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:44.823 13:51:32 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:45.080 13:51:32 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:28:45.080 13:51:32 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:45.080 13:51:32 chaining -- bdev/chaining.sh@132 -- # bperfpid=149059 00:28:45.080 13:51:32 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:45.080 13:51:32 chaining -- bdev/chaining.sh@134 -- # waitforlisten 149059 00:28:45.080 13:51:32 chaining -- common/autotest_common.sh@829 -- # '[' -z 149059 ']' 00:28:45.080 13:51:32 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:45.080 13:51:32 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:45.080 13:51:32 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:45.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:45.080 13:51:32 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:45.080 13:51:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:45.080 [2024-07-15 13:51:32.520811] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
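From here the test switches from spdk_dd to bdevperf: the app is launched idle (-z) with --wait-for-rpc, the malloc0 -> crypto0 -> crypto1 stack is configured over JSON-RPC, and examples/bdev/bdevperf/bdevperf.py perform_tests starts the 5-second verify workload. The bdevperf flags and the bdevperf.py call below are taken from the trace; the RPC calls, sizes and key material are assumptions sketched from current SPDK tooling (chaining.sh drives them through its rpc_cmd wrapper, and option spellings vary between releases), so treat this as an outline rather than the exact commands.

#!/usr/bin/env bash
# Outline of the bdevperf phase. SPDK_DIR, key material and RPC options are
# placeholders or assumptions; verify option names with `scripts/rpc.py <cmd> -h`.
SPDK_DIR=${SPDK_DIR:-/path/to/spdk}
rpc="$SPDK_DIR/scripts/rpc.py"

"$SPDK_DIR/build/examples/bdevperf" -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
bperfpid=$!

# Wait until the RPC server answers (chaining.sh uses its waitforlisten helper).
until "$rpc" rpc_get_methods >/dev/null 2>&1; do sleep 0.2; done

# --wait-for-rpc defers subsystem init until framework_start_init is called.
"$rpc" framework_start_init
"$rpc" accel_crypto_key_create -c AES_CBC -k 0123456789abcdef0123456789abcdef -n key0  # placeholder key
"$rpc" accel_crypto_key_create -c AES_CBC -k fedcba9876543210fedcba9876543210 -n key1  # placeholder key
"$rpc" bdev_malloc_create -b malloc0 32 512          # placeholder size: 32 MiB, 512 B blocks
"$rpc" bdev_crypto_create malloc0 crypto0 -n key0    # first crypto layer
"$rpc" bdev_crypto_create crypto0 crypto1 -n key1    # second crypto layer, chained on the first

# Kick off the queued verify job, wait for the summary table, then stop the app.
"$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" perform_tests
kill "$bperfpid"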
00:28:45.080 [2024-07-15 13:51:32.520868] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149059 ] 00:28:45.080 [2024-07-15 13:51:32.610112] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.337 [2024-07-15 13:51:32.700137] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:45.902 13:51:33 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:45.902 13:51:33 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:45.902 13:51:33 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:28:45.902 13:51:33 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:45.902 13:51:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:45.902 malloc0 00:28:45.902 true 00:28:45.902 true 00:28:45.902 [2024-07-15 13:51:33.468220] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:45.902 crypto0 00:28:45.902 [2024-07-15 13:51:33.476246] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:45.902 crypto1 00:28:45.902 13:51:33 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:45.902 13:51:33 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:46.159 Running I/O for 5 seconds... 00:28:51.461 00:28:51.461 Latency(us) 00:28:51.462 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:51.462 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:51.462 Verification LBA range: start 0x0 length 0x2000 00:28:51.462 crypto1 : 5.01 18103.67 70.72 0.00 0.00 14109.43 4331.07 10542.75 00:28:51.462 =================================================================================================================== 00:28:51.462 Total : 18103.67 70.72 0.00 0.00 14109.43 4331.07 10542.75 00:28:51.462 0 00:28:51.462 13:51:38 chaining -- bdev/chaining.sh@146 -- # killprocess 149059 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@948 -- # '[' -z 149059 ']' 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@952 -- # kill -0 149059 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@953 -- # uname 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 149059 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 149059' 00:28:51.462 killing process with pid 149059 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@967 -- # kill 149059 00:28:51.462 Received shutdown signal, test time was about 5.000000 seconds 00:28:51.462 00:28:51.462 Latency(us) 00:28:51.462 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:51.462 =================================================================================================================== 00:28:51.462 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@972 -- # wait 149059 00:28:51.462 13:51:38 chaining -- bdev/chaining.sh@151 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:51.462 13:51:38 chaining -- bdev/chaining.sh@152 -- # bperfpid=149886 00:28:51.462 13:51:38 chaining -- bdev/chaining.sh@154 -- # waitforlisten 149886 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@829 -- # '[' -z 149886 ']' 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:51.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:51.462 13:51:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:51.462 [2024-07-15 13:51:38.850050] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:51.462 [2024-07-15 13:51:38.850110] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149886 ] 00:28:51.462 [2024-07-15 13:51:38.931722] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:51.462 [2024-07-15 13:51:39.019887] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:52.028 13:51:39 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:52.028 13:51:39 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:52.028 13:51:39 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:28:52.028 13:51:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:52.028 13:51:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:52.292 malloc0 00:28:52.292 true 00:28:52.292 true 00:28:52.292 [2024-07-15 13:51:39.769074] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:28:52.292 [2024-07-15 13:51:39.769114] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:52.292 [2024-07-15 13:51:39.769139] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2532320 00:28:52.292 [2024-07-15 13:51:39.769152] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:52.292 [2024-07-15 13:51:39.769938] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:52.292 [2024-07-15 13:51:39.769960] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:28:52.292 pt0 00:28:52.292 [2024-07-15 13:51:39.777103] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:52.292 crypto0 00:28:52.292 [2024-07-15 13:51:39.785122] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:52.292 crypto1 00:28:52.292 13:51:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:52.292 13:51:39 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:52.292 Running I/O for 5 seconds... 
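The second bdevperf pass repeats the verify workload with a passthru vbdev registered on malloc0 (the vbdev_passthru notices above, ending in "created pt_bdev for: pt0") underneath the two crypto bdevs, so the same accel chain is exercised behind one extra bdev hop. The notices do not show which bdev each crypto layer was created on, so the layering in the sketch below (crypto0 on pt0) is an assumption, as are the RPC option names; the results table that follows is for the job running against crypto1.

# Hedged sketch of the second stack; option names assumed from current SPDK
# tooling, check them with `scripts/rpc.py <cmd> -h` on your tree.
rpc="$SPDK_DIR/scripts/rpc.py"                      # SPDK_DIR as in the earlier sketches
"$rpc" bdev_malloc_create -b malloc0 32 512         # base bdev (placeholder size)
"$rpc" bdev_passthru_create -b malloc0 -p pt0       # passthru layer ("created pt_bdev for: pt0")
"$rpc" bdev_crypto_create pt0 crypto0 -n key0       # assumed: first crypto layer sits on pt0
"$rpc" bdev_crypto_create crypto0 crypto1 -n key1   # bdevperf's verify job targets crypto1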
00:28:57.555 00:28:57.555 Latency(us) 00:28:57.555 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:57.555 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:57.555 Verification LBA range: start 0x0 length 0x2000 00:28:57.555 crypto1 : 5.01 14438.87 56.40 0.00 0.00 17686.61 4160.11 12765.27 00:28:57.555 =================================================================================================================== 00:28:57.555 Total : 14438.87 56.40 0.00 0.00 17686.61 4160.11 12765.27 00:28:57.555 0 00:28:57.555 13:51:44 chaining -- bdev/chaining.sh@167 -- # killprocess 149886 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@948 -- # '[' -z 149886 ']' 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@952 -- # kill -0 149886 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@953 -- # uname 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 149886 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 149886' 00:28:57.555 killing process with pid 149886 00:28:57.555 13:51:44 chaining -- common/autotest_common.sh@967 -- # kill 149886 00:28:57.555 Received shutdown signal, test time was about 5.000000 seconds 00:28:57.555 00:28:57.556 Latency(us) 00:28:57.556 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:57.556 =================================================================================================================== 00:28:57.556 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:57.556 13:51:44 chaining -- common/autotest_common.sh@972 -- # wait 149886 00:28:57.556 13:51:45 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:28:57.556 13:51:45 chaining -- bdev/chaining.sh@170 -- # killprocess 149886 00:28:57.556 13:51:45 chaining -- common/autotest_common.sh@948 -- # '[' -z 149886 ']' 00:28:57.556 13:51:45 chaining -- common/autotest_common.sh@952 -- # kill -0 149886 00:28:57.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (149886) - No such process 00:28:57.556 13:51:45 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 149886 is not found' 00:28:57.556 Process with pid 149886 is not found 00:28:57.556 13:51:45 chaining -- bdev/chaining.sh@171 -- # wait 149886 00:28:57.556 13:51:45 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:57.556 13:51:45 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:57.556 13:51:45 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:57.556 13:51:45 chaining -- 
nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:57.556 13:51:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@336 -- # return 1 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:28:57.556 WARNING: No supported devices were found, fallback requested for tcp test 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:28:57.556 13:51:45 chaining -- 
nvmf/common.sh@432 -- # nvmf_veth_init 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:28:57.556 13:51:45 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:28:57.814 Cannot find device "nvmf_tgt_br" 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@155 -- # true 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:28:57.814 Cannot find device "nvmf_tgt_br2" 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@156 -- # true 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:28:57.814 Cannot find device "nvmf_tgt_br" 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@158 -- # true 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:28:57.814 Cannot find device "nvmf_tgt_br2" 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@159 -- # true 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:28:57.814 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@162 -- # true 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:28:57.814 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@163 -- # true 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@178 -- # ip addr 
add 10.0.0.1/24 dev nvmf_init_if 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:28:57.814 13:51:45 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:28:58.072 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:58.072 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.112 ms 00:28:58.072 00:28:58.072 --- 10.0.0.2 ping statistics --- 00:28:58.072 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:58.072 rtt min/avg/max/mdev = 0.112/0.112/0.112/0.000 ms 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:28:58.072 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:28:58.072 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.058 ms 00:28:58.072 00:28:58.072 --- 10.0.0.3 ping statistics --- 00:28:58.072 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:58.072 rtt min/avg/max/mdev = 0.058/0.058/0.058/0.000 ms 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:28:58.072 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:28:58.072 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.032 ms 00:28:58.072 00:28:58.072 --- 10.0.0.1 ping statistics --- 00:28:58.072 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:58.072 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@433 -- # return 0 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:58.072 13:51:45 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@481 -- # nvmfpid=151020 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@482 -- # waitforlisten 151020 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@829 -- # '[' -z 151020 ']' 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:58.072 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:58.072 13:51:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:58.072 13:51:45 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:58.072 [2024-07-15 13:51:45.632434] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:28:58.072 [2024-07-15 13:51:45.632486] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:58.330 [2024-07-15 13:51:45.725299] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:58.330 [2024-07-15 13:51:45.813824] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:58.330 [2024-07-15 13:51:45.813864] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:28:58.330 [2024-07-15 13:51:45.813873] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:58.330 [2024-07-15 13:51:45.813882] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 
00:28:58.330 [2024-07-15 13:51:45.813889] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:28:58.330 [2024-07-15 13:51:45.813911] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:58.895 13:51:46 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:58.895 13:51:46 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:58.895 13:51:46 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:58.895 13:51:46 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:58.895 13:51:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:58.895 13:51:46 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:58.895 13:51:46 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:28:58.895 13:51:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:58.896 malloc0 00:28:58.896 [2024-07-15 13:51:46.478654] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:58.896 [2024-07-15 13:51:46.494850] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.896 13:51:46 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:28:58.896 13:51:46 chaining -- bdev/chaining.sh@189 -- # bperfpid=151189 00:28:58.896 13:51:46 chaining -- bdev/chaining.sh@191 -- # waitforlisten 151189 /var/tmp/bperf.sock 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@829 -- # '[' -z 151189 ']' 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:28:58.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:58.896 13:51:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:58.896 13:51:46 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:59.153 [2024-07-15 13:51:46.559265] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 
00:28:59.153 [2024-07-15 13:51:46.559317] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151189 ] 00:28:59.153 [2024-07-15 13:51:46.645824] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:59.153 [2024-07-15 13:51:46.737857] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:00.085 13:51:47 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:00.085 13:51:47 chaining -- common/autotest_common.sh@862 -- # return 0 00:29:00.085 13:51:47 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:29:00.085 13:51:47 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:29:00.085 [2024-07-15 13:51:47.698642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:29:00.341 nvme0n1 00:29:00.341 true 00:29:00.341 crypto0 00:29:00.341 13:51:47 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:29:00.341 Running I/O for 5 seconds... 00:29:05.594 00:29:05.594 Latency(us) 00:29:05.594 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:05.594 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:29:05.594 Verification LBA range: start 0x0 length 0x2000 00:29:05.594 crypto0 : 5.01 11693.05 45.68 0.00 0.00 21836.54 2635.69 18008.15 00:29:05.594 =================================================================================================================== 00:29:05.594 Total : 11693.05 45.68 0.00 0.00 21836.54 2635.69 18008.15 00:29:05.594 0 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:29:05.594 13:51:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@205 -- # sequence=117240 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:29:05.594 13:51:53 chaining -- 
bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:29:05.594 13:51:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@206 -- # encrypt=58620 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@207 -- # decrypt=58620 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:05.860 13:51:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:29:06.118 13:51:53 chaining -- bdev/chaining.sh@208 -- # crc32c=117240 00:29:06.118 13:51:53 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:29:06.118 13:51:53 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:29:06.118 13:51:53 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:29:06.118 13:51:53 chaining -- bdev/chaining.sh@214 -- # killprocess 151189 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@948 -- # '[' -z 151189 ']' 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@952 -- # kill -0 151189 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@953 -- # uname 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 151189 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 151189' 00:29:06.118 killing process with pid 151189 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@967 -- # kill 151189 00:29:06.118 Received shutdown signal, test time was 
about 5.000000 seconds 00:29:06.118 00:29:06.118 Latency(us) 00:29:06.118 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:06.118 =================================================================================================================== 00:29:06.118 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:06.118 13:51:53 chaining -- common/autotest_common.sh@972 -- # wait 151189 00:29:06.376 13:51:53 chaining -- bdev/chaining.sh@219 -- # bperfpid=152118 00:29:06.376 13:51:53 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:29:06.376 13:51:53 chaining -- bdev/chaining.sh@221 -- # waitforlisten 152118 /var/tmp/bperf.sock 00:29:06.376 13:51:53 chaining -- common/autotest_common.sh@829 -- # '[' -z 152118 ']' 00:29:06.376 13:51:53 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:29:06.376 13:51:53 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:06.376 13:51:53 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:29:06.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:29:06.376 13:51:53 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:06.376 13:51:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:06.376 [2024-07-15 13:51:53.905687] Starting SPDK v24.09-pre git sha1 9cede6267 / DPDK 24.03.0 initialization... 00:29:06.376 [2024-07-15 13:51:53.905748] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152118 ] 00:29:06.377 [2024-07-15 13:51:53.994221] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:06.634 [2024-07-15 13:51:54.075883] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:07.199 13:51:54 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:07.199 13:51:54 chaining -- common/autotest_common.sh@862 -- # return 0 00:29:07.199 13:51:54 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:29:07.199 13:51:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:29:07.456 [2024-07-15 13:51:55.050956] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:29:07.456 nvme0n1 00:29:07.456 true 00:29:07.456 crypto0 00:29:07.456 13:51:55 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:29:07.714 Running I/O for 5 seconds... 
00:29:13.052 00:29:13.052 Latency(us) 00:29:13.052 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:13.052 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:29:13.052 Verification LBA range: start 0x0 length 0x200 00:29:13.052 crypto0 : 5.01 2414.94 150.93 0.00 0.00 13006.36 1460.31 13221.18 00:29:13.052 =================================================================================================================== 00:29:13.052 Total : 2414.94 150.93 0.00 0.00 13006.36 1460.31 13221.18 00:29:13.052 0 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@233 -- # sequence=24174 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@234 -- # encrypt=12087 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:29:13.052 13:52:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@235 -- # decrypt=12087 
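(The pass/fail decision for each run is the accel-stat consistency check being walked through here: sequence_executed and the per-opcode executed counters are read back over the bperf socket and compared. Condensed into one fragment, with socket path, rpc.py invocation and JSON fields exactly as traced and only the surrounding shell being illustrative:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock"
    stats=$($rpc accel_get_stats)
    sequence=$(jq -r '.sequence_executed' <<< "$stats")
    encrypt=$(jq -r '.operations[] | select(.opcode == "encrypt").executed' <<< "$stats")
    decrypt=$(jq -r '.operations[] | select(.opcode == "decrypt").executed' <<< "$stats")
    crc32c=$(jq -r '.operations[] | select(.opcode == "crc32c").executed' <<< "$stats")
    # For this 65536-byte pass: 12087 + 12087 == 24174, matching both
    # sequence_executed above and the crc32c counter read just below.
    (( sequence > 0 && encrypt + decrypt == sequence && encrypt + decrypt == crc32c ))

The earlier 4096-byte pass satisfies the same identity with 58620 + 58620 == 117240.)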
00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:29:13.309 13:52:00 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:29:13.310 13:52:00 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:29:13.310 13:52:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:13.310 13:52:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:29:13.567 13:52:00 chaining -- bdev/chaining.sh@236 -- # crc32c=24174 00:29:13.567 13:52:00 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:29:13.567 13:52:00 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:29:13.567 13:52:00 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:29:13.567 13:52:00 chaining -- bdev/chaining.sh@242 -- # killprocess 152118 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@948 -- # '[' -z 152118 ']' 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@952 -- # kill -0 152118 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@953 -- # uname 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 152118 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:13.567 13:52:00 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:13.568 13:52:00 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 152118' 00:29:13.568 killing process with pid 152118 00:29:13.568 13:52:00 chaining -- common/autotest_common.sh@967 -- # kill 152118 00:29:13.568 Received shutdown signal, test time was about 5.000000 seconds 00:29:13.568 00:29:13.568 Latency(us) 00:29:13.568 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:13.568 =================================================================================================================== 00:29:13.568 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:13.568 13:52:00 chaining -- common/autotest_common.sh@972 -- # wait 152118 00:29:13.825 13:52:01 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@117 -- # sync 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@120 -- # set +e 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:29:13.825 rmmod nvme_tcp 00:29:13.825 rmmod nvme_fabrics 00:29:13.825 rmmod nvme_keyring 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@124 -- # set -e 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@125 -- # return 0 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@489 -- # '[' -n 
151020 ']' 00:29:13.825 13:52:01 chaining -- nvmf/common.sh@490 -- # killprocess 151020 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@948 -- # '[' -z 151020 ']' 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@952 -- # kill -0 151020 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@953 -- # uname 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 151020 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 151020' 00:29:13.825 killing process with pid 151020 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@967 -- # kill 151020 00:29:13.825 13:52:01 chaining -- common/autotest_common.sh@972 -- # wait 151020 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:29:14.083 13:52:01 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:29:14.083 13:52:01 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:29:14.083 13:52:01 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:29:14.083 13:52:01 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:29:14.084 00:29:14.084 real 0m42.898s 00:29:14.084 user 0m54.023s 00:29:14.084 sys 0m12.132s 00:29:14.084 13:52:01 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:14.084 13:52:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:14.084 ************************************ 00:29:14.084 END TEST chaining 00:29:14.084 ************************************ 00:29:14.084 13:52:01 -- common/autotest_common.sh@1142 -- # return 0 00:29:14.084 13:52:01 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:29:14.084 13:52:01 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:29:14.084 13:52:01 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:29:14.084 13:52:01 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:29:14.084 13:52:01 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:29:14.084 13:52:01 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:29:14.084 13:52:01 -- common/autotest_common.sh@722 -- # xtrace_disable 00:29:14.084 13:52:01 -- common/autotest_common.sh@10 -- # set +x 00:29:14.084 13:52:01 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:29:14.084 13:52:01 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:29:14.084 13:52:01 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:29:14.084 13:52:01 -- common/autotest_common.sh@10 -- # set +x 00:29:19.346 INFO: APP EXITING 00:29:19.346 INFO: killing all VMs 00:29:19.346 INFO: killing vhost app 00:29:19.346 WARN: no vhost pid file found 00:29:19.346 INFO: EXIT DONE 00:29:22.685 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:29:22.685 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 
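(A note on the transport used above: since no supported NIC was present, the test fell back to the veth network that nvmf_veth_init builds and that is flushed again here. Stripped to the parts needed for the 10.0.0.1 -> 10.0.0.2 path, with names and addresses as traced and the second target interface, the iptables rules and the nvme-tcp modprobe omitted, the topology amounts to:

    ip netns add nvmf_tgt_ns_spdk
    ip link add nvmf_init_if type veth peer name nvmf_init_br
    ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br
    ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk
    ip addr add 10.0.0.1/24 dev nvmf_init_if
    ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if
    ip link set nvmf_init_if up
    ip link set nvmf_init_br up
    ip link set nvmf_tgt_br up
    ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
    ip link add nvmf_br type bridge
    ip link set nvmf_br up
    ip link set nvmf_init_br master nvmf_br
    ip link set nvmf_tgt_br master nvmf_br
    ping -c 1 10.0.0.2    # host (initiator) side -> target namespace, as in the trace

The nvmf target was then started inside the namespace via "ip netns exec nvmf_tgt_ns_spdk ... nvmf_tgt" and listened on 10.0.0.2 port 4420, which is the address the chaining test connected to.)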
00:29:22.685 Waiting for block devices as requested 00:29:22.685 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:29:22.685 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:29:22.685 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:29:22.685 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:29:22.685 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:29:22.685 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:29:22.685 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:29:22.945 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:29:22.945 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:29:22.945 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:29:22.945 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:29:23.204 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:29:23.204 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:29:23.204 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:29:23.463 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:29:23.463 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:29:23.463 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:29:27.655 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:29:27.655 0000:ae:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:ae:05.5 00:29:27.655 Cleaning 00:29:27.655 Removing: /var/run/dpdk/spdk0/config 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:29:27.655 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:29:27.655 Removing: /var/run/dpdk/spdk0/hugepage_info 00:29:27.655 Removing: /dev/shm/nvmf_trace.0 00:29:27.655 Removing: /dev/shm/spdk_tgt_trace.pid4129184 00:29:27.655 Removing: /var/run/dpdk/spdk0 00:29:27.655 Removing: /var/run/dpdk/spdk_pid101127 00:29:27.655 Removing: /var/run/dpdk/spdk_pid103021 00:29:27.655 Removing: /var/run/dpdk/spdk_pid106555 00:29:27.655 Removing: /var/run/dpdk/spdk_pid106861 00:29:27.655 Removing: /var/run/dpdk/spdk_pid107202 00:29:27.655 Removing: /var/run/dpdk/spdk_pid107568 00:29:27.655 Removing: /var/run/dpdk/spdk_pid108027 00:29:27.655 Removing: /var/run/dpdk/spdk_pid108716 00:29:27.655 Removing: /var/run/dpdk/spdk_pid109437 00:29:27.655 Removing: /var/run/dpdk/spdk_pid109718 00:29:27.655 Removing: /var/run/dpdk/spdk_pid110801 00:29:27.655 Removing: /var/run/dpdk/spdk_pid111884 00:29:27.655 Removing: /var/run/dpdk/spdk_pid112964 00:29:27.655 Removing: /var/run/dpdk/spdk_pid113753 00:29:27.655 Removing: /var/run/dpdk/spdk_pid114832 00:29:27.655 Removing: /var/run/dpdk/spdk_pid115909 00:29:27.655 Removing: /var/run/dpdk/spdk_pid117120 00:29:27.655 Removing: /var/run/dpdk/spdk_pid118284 00:29:27.655 Removing: /var/run/dpdk/spdk_pid118829 00:29:27.655 Removing: /var/run/dpdk/spdk_pid119203 00:29:27.655 Removing: /var/run/dpdk/spdk_pid121030 00:29:27.655 Removing: /var/run/dpdk/spdk_pid122886 00:29:27.655 Removing: /var/run/dpdk/spdk_pid124746 00:29:27.655 Removing: /var/run/dpdk/spdk_pid125812 00:29:27.655 Removing: /var/run/dpdk/spdk_pid126883 00:29:27.655 Removing: /var/run/dpdk/spdk_pid127427 00:29:27.655 Removing: /var/run/dpdk/spdk_pid127450 00:29:27.655 Removing: 
/var/run/dpdk/spdk_pid127682 00:29:27.655 Removing: /var/run/dpdk/spdk_pid127885 00:29:27.655 Removing: /var/run/dpdk/spdk_pid128007 00:29:27.655 Removing: /var/run/dpdk/spdk_pid128969 00:29:27.655 Removing: /var/run/dpdk/spdk_pid130478 00:29:27.655 Removing: /var/run/dpdk/spdk_pid131985 00:29:27.655 Removing: /var/run/dpdk/spdk_pid132708 00:29:27.655 Removing: /var/run/dpdk/spdk_pid133585 00:29:27.655 Removing: /var/run/dpdk/spdk_pid133780 00:29:27.655 Removing: /var/run/dpdk/spdk_pid133812 00:29:27.655 Removing: /var/run/dpdk/spdk_pid133872 00:29:27.655 Removing: /var/run/dpdk/spdk_pid134771 00:29:27.655 Removing: /var/run/dpdk/spdk_pid135316 00:29:27.655 Removing: /var/run/dpdk/spdk_pid135689 00:29:27.655 Removing: /var/run/dpdk/spdk_pid137512 00:29:27.655 Removing: /var/run/dpdk/spdk_pid139375 00:29:27.655 Removing: /var/run/dpdk/spdk_pid141220 00:29:27.655 Removing: /var/run/dpdk/spdk_pid142682 00:29:27.655 Removing: /var/run/dpdk/spdk_pid143852 00:29:27.655 Removing: /var/run/dpdk/spdk_pid144400 00:29:27.655 Removing: /var/run/dpdk/spdk_pid144426 00:29:27.655 Removing: /var/run/dpdk/spdk_pid148314 00:29:27.655 Removing: /var/run/dpdk/spdk_pid148524 00:29:27.655 Removing: /var/run/dpdk/spdk_pid148558 00:29:27.655 Removing: /var/run/dpdk/spdk_pid148754 00:29:27.655 Removing: /var/run/dpdk/spdk_pid148909 00:29:27.655 Removing: /var/run/dpdk/spdk_pid149059 00:29:27.655 Removing: /var/run/dpdk/spdk_pid149886 00:29:27.655 Removing: /var/run/dpdk/spdk_pid151189 00:29:27.655 Removing: /var/run/dpdk/spdk_pid152118 00:29:27.655 Removing: /var/run/dpdk/spdk_pid16931 00:29:27.655 Removing: /var/run/dpdk/spdk_pid19677 00:29:27.655 Removing: /var/run/dpdk/spdk_pid20483 00:29:27.655 Removing: /var/run/dpdk/spdk_pid29166 00:29:27.655 Removing: /var/run/dpdk/spdk_pid31094 00:29:27.655 Removing: /var/run/dpdk/spdk_pid32068 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4128414 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4129184 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4129686 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4130477 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4130701 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4131466 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4131507 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4131767 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4133336 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4134513 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4134740 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4134983 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4135268 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4135624 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4135820 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4136018 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4136247 00:29:27.914 Removing: /var/run/dpdk/spdk_pid4136832 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4139245 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4139454 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4139780 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4139998 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4140026 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4140249 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4140450 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4140647 00:29:27.915 Removing: /var/run/dpdk/spdk_pid41408 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4140844 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4141041 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4141234 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4141439 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4141635 00:29:27.915 
Removing: /var/run/dpdk/spdk_pid4141872 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4142134 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4142387 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4142584 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4142781 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4142976 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4143168 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4143370 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4143566 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4143794 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4144122 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4144451 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4144639 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4145123 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4145529 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4145790 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4146047 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4146359 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4146607 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4146928 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4147154 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4147355 00:29:27.915 Removing: /var/run/dpdk/spdk_pid4147614 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4148077 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4148340 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4148481 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4151757 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4153468 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4155159 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4155889 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4156952 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4157182 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4157348 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4157369 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4161164 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4161572 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4162604 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4162801 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4167278 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4168561 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4169362 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4173258 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4174566 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4175347 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4178820 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4180693 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4181462 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4189205 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4191003 00:29:28.174 Removing: /var/run/dpdk/spdk_pid4191820 00:29:28.174 Removing: /var/run/dpdk/spdk_pid43369 00:29:28.174 Removing: /var/run/dpdk/spdk_pid44291 00:29:28.174 Removing: /var/run/dpdk/spdk_pid53025 00:29:28.174 Removing: /var/run/dpdk/spdk_pid56187 00:29:28.174 Removing: /var/run/dpdk/spdk_pid57011 00:29:28.174 Removing: /var/run/dpdk/spdk_pid57978 00:29:28.174 Removing: /var/run/dpdk/spdk_pid60520 00:29:28.174 Removing: /var/run/dpdk/spdk_pid6061 00:29:28.174 Removing: /var/run/dpdk/spdk_pid65243 00:29:28.174 Removing: /var/run/dpdk/spdk_pid67446 00:29:28.174 Removing: /var/run/dpdk/spdk_pid71351 00:29:28.174 Removing: /var/run/dpdk/spdk_pid74248 00:29:28.174 Removing: /var/run/dpdk/spdk_pid78753 00:29:28.174 Removing: /var/run/dpdk/spdk_pid81146 00:29:28.174 Removing: /var/run/dpdk/spdk_pid8275 00:29:28.174 Removing: /var/run/dpdk/spdk_pid86491 00:29:28.174 Removing: /var/run/dpdk/spdk_pid88414 00:29:28.174 Removing: /var/run/dpdk/spdk_pid9089 
00:29:28.174 Removing: /var/run/dpdk/spdk_pid94120
00:29:28.174 Removing: /var/run/dpdk/spdk_pid95996
00:29:28.174 Clean
00:29:28.433 13:52:15 -- common/autotest_common.sh@1451 -- # return 0
00:29:28.433 13:52:15 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:29:28.434 13:52:15 -- common/autotest_common.sh@728 -- # xtrace_disable
00:29:28.434 13:52:15 -- common/autotest_common.sh@10 -- # set +x
00:29:28.434 13:52:15 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:29:28.434 13:52:15 -- common/autotest_common.sh@728 -- # xtrace_disable
00:29:28.434 13:52:15 -- common/autotest_common.sh@10 -- # set +x
00:29:28.434 13:52:15 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:29:28.434 13:52:15 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:29:28.434 13:52:15 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:29:28.434 13:52:15 -- spdk/autotest.sh@391 -- # hash lcov
00:29:28.434 13:52:15 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:29:28.434 13:52:15 -- spdk/autotest.sh@393 -- # hostname
00:29:28.434 13:52:15 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-51 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:29:28.693 geninfo: WARNING: invalid characters removed from testname!
00:29:46.775 13:52:33 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:48.673 13:52:36 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:50.568 13:52:37 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:52.468 13:52:39 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:53.841 13:52:41 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:55.741 13:52:42 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:57.114 13:52:44 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:29:57.114 13:52:44 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:29:57.114 13:52:44 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:29:57.114 13:52:44 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:29:57.114 13:52:44 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:29:57.115 13:52:44 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.115 13:52:44 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.115 13:52:44 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.115 13:52:44 -- paths/export.sh@5 -- $ export PATH
00:29:57.115 13:52:44 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.115 13:52:44 -- common/autobuild_common.sh@472 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:29:57.115 13:52:44 -- common/autobuild_common.sh@473 -- $ date +%s
00:29:57.115 13:52:44 -- common/autobuild_common.sh@473 -- $ mktemp -dt spdk_1721044364.XXXXXX
00:29:57.115 13:52:44 -- common/autobuild_common.sh@473 -- $ SPDK_WORKSPACE=/tmp/spdk_1721044364.Iu77qg
00:29:57.115 13:52:44 -- common/autobuild_common.sh@475 -- $ [[ -n '' ]]
00:29:57.115 13:52:44 -- common/autobuild_common.sh@479 -- $ '[' -n '' ']'
00:29:57.115 13:52:44 -- common/autobuild_common.sh@482 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:29:57.115 13:52:44 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:29:57.115 13:52:44 -- common/autobuild_common.sh@488 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:29:57.115 13:52:44 -- common/autobuild_common.sh@489 -- $ get_config_params
00:29:57.115 13:52:44 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:29:57.115 13:52:44 -- common/autotest_common.sh@10 -- $ set +x
00:29:57.373 13:52:44 -- common/autobuild_common.sh@489 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:29:57.373 13:52:44 -- common/autobuild_common.sh@491 -- $ start_monitor_resources
00:29:57.373 13:52:44 -- pm/common@17 -- $ local monitor
00:29:57.373 13:52:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:57.373 13:52:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:57.373 13:52:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:57.373 13:52:44 -- pm/common@21 -- $ date +%s
00:29:57.373 13:52:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:57.373 13:52:44 -- pm/common@21 -- $ date +%s
00:29:57.373 13:52:44 -- pm/common@25 -- $ sleep 1
00:29:57.373 13:52:44 -- pm/common@21 -- $ date +%s
00:29:57.373 13:52:44 -- pm/common@21 -- $ date +%s
00:29:57.373 13:52:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.release_build.sh.1721044364
00:29:57.373 13:52:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.release_build.sh.1721044364
00:29:57.373 13:52:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.release_build.sh.1721044364
00:29:57.373 13:52:44 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.release_build.sh.1721044364
00:29:57.373 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.release_build.sh.1721044364_collect-vmstat.pm.log
00:29:57.373 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.release_build.sh.1721044364_collect-cpu-temp.pm.log
00:29:57.373 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.release_build.sh.1721044364_collect-cpu-load.pm.log
00:29:57.373 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.release_build.sh.1721044364_collect-bmc-pm.bmc.pm.log
00:29:58.307 13:52:45 -- common/autobuild_common.sh@492 -- $ trap stop_monitor_resources EXIT
00:29:58.307 13:52:45 -- spdk/release_build.sh@10 -- $ [[ 0 -eq 1 ]]
00:29:58.307 13:52:45 -- spdk/release_build.sh@1 -- $ stop_monitor_resources
00:29:58.307 13:52:45 -- pm/common@29 -- $ signal_monitor_resources TERM
00:29:58.307 13:52:45 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:29:58.307 13:52:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:58.307 13:52:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:29:58.307 13:52:45 -- pm/common@44 -- $ pid=162038
00:29:58.307 13:52:45 -- pm/common@50 -- $ kill -TERM 162038
00:29:58.307 13:52:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:58.307 13:52:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:29:58.307 13:52:45 -- pm/common@44 -- $ pid=162040
00:29:58.307 13:52:45 -- pm/common@50 -- $ kill -TERM 162040
00:29:58.307 13:52:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:58.307 13:52:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:29:58.307 13:52:45 -- pm/common@44 -- $ pid=162042
00:29:58.307 13:52:45 -- pm/common@50 -- $ kill -TERM 162042
00:29:58.307 13:52:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:58.307 13:52:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:29:58.307 13:52:45 -- pm/common@44 -- $ pid=162066
00:29:58.307 13:52:45 -- pm/common@50 -- $ sudo -E kill -TERM 162066
00:29:58.307 + [[ -n 4018129 ]]
00:29:58.307 + sudo kill 4018129
00:29:58.333 [Pipeline] }
00:29:58.351 [Pipeline] // stage
00:29:58.356 [Pipeline] }
00:29:58.374 [Pipeline] // timeout
00:29:58.380 [Pipeline] }
00:29:58.398 [Pipeline] // catchError
00:29:58.403 [Pipeline] }
00:29:58.421 [Pipeline] // wrap
00:29:58.428 [Pipeline] }
00:29:58.441 [Pipeline] // catchError
00:29:58.452 [Pipeline] stage
00:29:58.454 [Pipeline] { (Epilogue)
00:29:58.466 [Pipeline] catchError
00:29:58.468 [Pipeline] {
00:29:58.480 [Pipeline] echo
00:29:58.481 Cleanup processes
00:29:58.486 [Pipeline] sh
00:29:58.825 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:58.825 162149 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:29:58.825 162361 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:58.839 [Pipeline] sh
00:29:59.120 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:59.120 ++ grep -v 'sudo pgrep'
00:29:59.120 ++ awk '{print $1}'
00:29:59.120 + sudo kill -9 162149
00:29:59.132 [Pipeline] sh
00:29:59.412 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:30:07.523 [Pipeline] sh
00:30:07.797 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:30:07.797 Artifacts sizes are good
00:30:07.810 [Pipeline] archiveArtifacts
00:30:07.816 Archiving artifacts
00:30:07.920 [Pipeline] sh
00:30:08.201 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:30:08.215 [Pipeline] cleanWs
00:30:08.224 [WS-CLEANUP] Deleting project workspace...
00:30:08.224 [WS-CLEANUP] Deferred wipeout is used...
00:30:08.231 [WS-CLEANUP] done
00:30:08.232 [Pipeline] }
00:30:08.257 [Pipeline] // catchError
00:30:08.271 [Pipeline] sh
00:30:08.555 + logger -p user.info -t JENKINS-CI
00:30:08.564 [Pipeline] }
00:30:08.581 [Pipeline] // stage
00:30:08.587 [Pipeline] }
00:30:08.606 [Pipeline] // node
00:30:08.612 [Pipeline] End of Pipeline
00:30:08.645 Finished: SUCCESS